Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/09/09 00:48:35 UTC

Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #972

See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/972/display/redirect?page=changes>

Changes:

[yifanmai] Add support for multiple inputs

[yifanmai] Add tests for multiple inputs in PTransformOverride

[yifanmai] Lint

[noreply] Merge pull request #12703 from [BEAM-10603] Add describe and cancel to


------------------------------------------
[...truncated 294.42 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 09, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 09, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 09, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 09, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 09, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
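
The readUsingDefaultMethod failure above is the missing-Coder error spelled out in the message: the output of ParDo(RowMonitor) is a PCollection of Beam Row, and a Row coder cannot be inferred automatically. A minimal, self-contained sketch of the two fixes the message suggests is below; it is not the test's code, and the schema, element values, and DoFn stand-in are illustrative only.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowCoderFixSketch {
      // Illustrative schema; the real RowMonitor output carries the HACKER_NEWS
      // table schema, which this hand-written one does not reproduce.
      static final Schema SCHEMA =
          Schema.builder().addStringField("author").addInt64Field("score").build();

      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // A ParDo that emits Beam Rows, standing in for ParDo(RowMonitor).
        PCollection<Row> rows =
            p.apply(Create.of(1L, 5L))
                .apply(ParDo.of(new DoFn<Long, Row>() {
                  @ProcessElement
                  public void process(@Element Long score, OutputReceiver<Row> out) {
                    out.output(Row.withSchema(SCHEMA).addValues("someone", score).build());
                  }
                }));

        // Without one of the two lines below, running the pipeline fails with the
        // same "Unable to return a default Coder" IllegalStateException as above.
        rows.setRowSchema(SCHEMA);
        // Alternatively: rows.setCoder(RowCoder.of(SCHEMA));

        p.run().waitUntilFinish();
      }
    }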

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 09, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 09, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 09, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 09, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
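
For the push-down case above, the planner replaced the generic scan with a BeamPushDownIOSourceRel: only the used fields (by, type, title, score) are read, and the supported filter is handed to the BigQuery Storage Read API. At the BigQueryIO level this corresponds roughly to the read configuration sketched below; the field list and filter come from the log, while the table reference and the rest of the setup are illustrative assumptions, not taken from the test.

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class PushDownReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // A Storage API (DIRECT_READ) scan restricted to the used fields and the
        // pushed-down filter, so projection and filtering happen inside BigQuery.
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full") // assumed table, not named in the log
                    .withMethod(Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
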
    Sep 09, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-i38qfRYOLPeUtCOy0ORsJZyDxSvWDyYBLvnim54xMZQ.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-a5TFoHMTU7Z6PP244TyXp4RH4my-vbMmN5bbwfXKm7Y.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-_9DUzQM6nf2gy-BnorcOqEHHOsq9z5EOm5NmI0WzSkE.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-KTojdAYfFOSCZcl1o3FwGxu-Jpbyh9VYa5hR4TRJMK8.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests--6aPEBBmMoaXj8jFqUnXFsXHCGHKLaHRgoJfZTMJF_A.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-SsehBZhG81g6wDffrNth1SgzHtApPIVQj0_tkX37DQQ.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8688062741569456157.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-2EhycPecsmjC-Ej-uFqaTETV1UBPRAKnfxGsjxqyM4I.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-zpuru1y-wcY1g4vrDPB8rx5k21X7foGhzvi_VtpiA20.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-OuP_feNu43EpN4bpSusFBM1VeQpOvm_x_7zTdxw4JY0.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-KU1MK-d4p5rSF_bRaqpsHYY07THVINpUVXym07NVUNI.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-zBj_QgLq6HEg3DMifZWIJTXCgkIBxi-Ttq2EbP1YVys.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-r1nYzYVGme9xDocFaAUWabzqHNy2r3qpqSkLF6MGs70.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-Cz3GZnUY-aSg9PIFHy0ba7-WdjmenKTR8H8er_4-nGA.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-EhuAGLRBzmf_caw1uh0ps1W_qupbHiaFpc7C_b2cEUM.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-i38qfRYOLPeUtCOy0ORsJZyDxSvWDyYBLvnim54xMZQ.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-7yBdrdl02MH8hg8JrNmT1jhB3onoWJDn7wzkSEBkHzs.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-uHRMmwtmOKnCQpoa-ruqTDPhArPtEM9HmsyVD9Ok5uA.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-UQhd_z069CEElvBrE_QbSU_ZTz_qjPtqTQDAAwCUDvM.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-4SYVIUTbKCMhIOjAOrgsZ6-6lwksvsXIz4Wchd0Bzl8.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-ygkA0nGwftuahAq_AxcLJyw3SBlYNRI5f2zOVEByjX4.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-j0tongq0ce2olJqsSXm2Dp8crvppbZ16eq26c0MvViU.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-ycBX94_AtGR8C--wCSWLgtXaXFggplTqO-XNYuBhf8M.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-EY0di0mg2ixgUzguBn7ZTv6au_5_oGB-5-ZG0IZlFkE.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-LlPQXeVYznnG65AtXv0sz_60wwWl0ZGEl4yVLreU4SQ.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-TgG0Ox-WxA_E-MxQJjhA6jc87BLK5e2Lrnq_GqlZtc8.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-x2HcHi48ciKbtujFsYHuMeZp8iMD3I61bi40tofCoIA.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-fDx-FsawoRy8F6eF8ItsMJZFGYL2WgQkXYfzfPaAkoU.jar
    Sep 09, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-lAwSY-GP3YY7HspN5wDmlR-Tm4opSEeXO3bsBi9jC_Q.jar
    Sep 09, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-pULAr_gYt6_pU3-HG-H9QV01GYekbx2tRokeu3UihTU.jar
    Sep 09, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-9OfWeZD6bcHXjnJM_sQWOW4dxPy7mwam-0DU1mIllck.jar
    Sep 09, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-8ofOmm3jXpRNjiUpP-Zf2IXprhXX6OXQol906IR44bg.jar
    Sep 09, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 0 seconds
    Sep 09, 2020 12:45:36 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 09, 2020 12:45:36 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 09, 2020 12:45:36 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 09, 2020 12:45:36 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 09, 2020 12:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 09, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash fddb4c588fd381e6feca16cf225c83dff90918d58f89c37d22756677a8a95c72> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-_dtMWI_Tgeb-yhbPIlyD3_kJGNWPicN9InVmd6ipXHI.pb
    Sep 09, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 09, 2020 12:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-08_17_45_37-1554375378502726365?project=apache-beam-testing
    Sep 09, 2020 12:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-08_17_45_37-1554375378502726365
    Sep 09, 2020 12:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-08_17_45_37-1554375378502726365
    Sep 09, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-09T00:45:37.216Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 09, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:45:47.463Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 09, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:45:50.567Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 09, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:45:50.610Z: Expanding GroupByKey operations into optimizable parts.
    Sep 09, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:45:50.634Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 09, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:45:50.713Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 09, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:45:50.740Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 09, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:45:50.761Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 09, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:45:50.803Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 09, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:45:51.218Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 09, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:45:51.287Z: Starting 5 workers in us-central1-a...
    Sep 09, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-09T00:46:11.941Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 09, 2020 12:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:46:23.543Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Sep 09, 2020 12:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:46:23.576Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Sep 09, 2020 12:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:46:29.197Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 09, 2020 12:46:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:46:47.967Z: Workers have started successfully.
    Sep 09, 2020 12:46:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:46:47.995Z: Workers have started successfully.
    Sep 09, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:47:20.458Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 09, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:47:20.610Z: Cleaning up.
    Sep 09, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:47:20.685Z: Stopping worker pool...
    Sep 09, 2020 12:48:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:48:24.140Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 09, 2020 12:48:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T00:48:24.187Z: Worker pool stopped.
    Sep 09, 2020 12:48:33 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-08_17_45_37-1554375378502726365 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ad4a8d9c-cb1c-443d-ad70-67042537526f and timestamp: 2020-09-09T00:48:33.357000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.311

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 09, 2020 12:48:33 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 3 mins 11.272 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 4s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/yi3otqvokszdq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1222

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1222/display/redirect>

Changes:


------------------------------------------
[...truncated 337.41 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 10, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 10, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 10, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 10, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 10, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 10, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 10, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1452961901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:280)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:153)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:546)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 10, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 10, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 10, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 10, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 10, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 10, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 10, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 10, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Nov 10, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-lYX3l7fRYv63p64Ifu6UHFVoMFzYvN-CyJDtqnUhsSo.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-H8v95nO1fZ7WWFZ8z7IPirYrPQ_TvMZcaIrVUSCgdxo.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1681076093057328142.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-XKecnm9rLQSNoRChrSpz3-xHxzCfYY3v1w8Fx_UYKQk.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-SB3Wgr4t8MRvbS_QgpeUwbl8NGpxzeXIqjuJcgQYhec.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests-vpGMqsDI_RtDWvNNbYTaZEgZG0sZ22v5RFQFGhdbBW4.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT-hBWLBpR7h8N6bEf924tl9tNy15o4p_ByUSq5SJPGipo.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests-Km8OL0pUKeUEHad1N2wzy654X3DHcoW6zzUlzFf4QCU.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-4ku9XRet1YPxVVgFC2VjEfCNQSBzYoOm6htZ37HPKYI.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests-7x0jnODKb_cvYv-m8mP6KrHUyUhpF2Tbtl779qW3tVc.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests-QbEo-2DF_aB6GeAxAopoSkaAzBxlbzIVLlOYhh67QkU.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-S-nsW01DIP_cjIIiSFi9QCukL9fKvZVrWspSYL8r1Ng.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests-t4ZU0VJ2f2F-EURuqIamnTPz5urAwXJstSSeRZ3euZo.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-ZeCBfEDZ25cJbEiiYB3jvGA90y59T9OREZ4SdKuQcE8.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-DabkOO_EyY2ZwLPOd59loXsNluOFfT54MVrtr73JAcI.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests-ntfXHb7KQPkEuYokehVRomRindCFs5GdH40FYV50eM8.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.27.0-SNAPSHOT-6bYjNOQ_13ideNhawBdONmCvxz75yGC_fL4Vz1XmQdE.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.27.0-SNAPSHOT-zrQkry6M-GKTtsIzwPIVQvuJe0ubhPcBmgtSX4X0U_A.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests-WPWakea6ZBtYnBX-1qWg6Fudrd7fiqN2F3E8fCaosjg.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT-WgzkUSe1bEyfl_u54Py6DEtKIZQhb1Lna1PnESEabkY.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-9K5ONrVVRBTLUBYeQOiUDV8fjV7DgS4K7F-OxjCYSlo.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-9nE8Nz8cqrtb39Op-UvYE7IMHytMI4evYy07yzY7MBI.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.27.0-SNAPSHOT-OkmRQ52s4a4AIHx93jGIzS1qii2xcbk-XayUntbLuJc.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-UDWPpezP7XCHUf-6VUNlA8far1p2tp9EqgItkofZ0CQ.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-un4GGKBZFtqPl6sE8gAUb8P8P0yxhxa_-7XNcsaMLrU.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT-DUNsOXy47eexZW_DyzoOv907p4ubD-XCW-lzA4zINpI.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded-CXZVVRugm7e9puPWb23HTtc6RV2gndNLwQPM7rOgGng.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-tests-T-b8PpOAPhZOg2M-GrSXBo3nZXlxQnxwBJRBw6xL2a8.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-lYX3l7fRYv63p64Ifu6UHFVoMFzYvN-CyJDtqnUhsSo.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.27.0-SNAPSHOT-aJqD4-_tiMMMEUQ13RHgEY2cI_PPbyxR3HVh1dElxI0.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.27.0-SNAPSHOT-HwwrPeLbBbqHjNRdK6vxGDq9O8HAGrfojwqisQkpO60.jar
    Nov 10, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.27.0-SNAPSHOT-EWKtN6-Xvpd3JlwWU3-9FuzM4pl8pffuFgNkDHtWwUQ.jar
    Nov 10, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 196 files cached, 30 files newly uploaded in 1 seconds
    Nov 10, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 10, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 10, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 10, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 10, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 10, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98995 bytes, hash bc9fc702422212345ce38ca5e95566e476ef8fdb68b415b21abf2b083f294822> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-vJ_HAkIiEjRc44yl6VVm5Hbvj9totBWyGr8rCD8pSCI.pb
    Nov 10, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.27.0-SNAPSHOT
    Nov 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-10_04_45_23-10760075097034053123?project=apache-beam-testing
    Nov 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-10_04_45_23-10760075097034053123
    Nov 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-10_04_45_23-10760075097034053123
    Nov 10, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-10T12:45:23.211Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 10, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T12:45:31.359Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 10, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T12:45:32.917Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 10, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T12:45:32.956Z: Expanding GroupByKey operations into optimizable parts.
    Nov 10, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T12:45:32.981Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 10, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T12:45:33.050Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 10, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T12:45:33.091Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 10, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T12:45:33.126Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 10, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T12:45:33.161Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 10, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T12:45:33.512Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 10, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T12:45:33.596Z: Starting 5 workers in us-central1-f...
    Nov 10, 2020 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T12:45:39.939Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 10, 2020 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T12:45:56.346Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Nov 10, 2020 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T12:45:56.377Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Nov 10, 2020 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T12:46:06.679Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 10, 2020 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T12:46:18.396Z: Workers have started successfully.
    Nov 10, 2020 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T12:46:18.431Z: Workers have started successfully.
    Nov 10, 2020 12:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T12:46:53.209Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 10, 2020 12:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T12:46:53.386Z: Cleaning up.
    Nov 10, 2020 12:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T12:46:53.480Z: Stopping worker pool...
    Nov 10, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T12:47:42.541Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 10, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T12:47:42.579Z: Worker pool stopped.
    Nov 10, 2020 12:47:49 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-10_04_45_23-10760075097034053123 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 057b1d93-7223-4691-8e4f-3c2ce5843b61 and timestamp: 2020-11-10T12:47:49.168000000Z:
                     Metric:                    Value:
                   read_time                    11.098
                 fields_read                 4375276.0
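
The two metrics above come from the monitor ParDos in the fused stage (ParDo(RowMonitor) and ParDo(TimeMonitor)). A rough sketch of what such a monitor looks like as a Beam DoFn follows; the namespace, metric names, and exact semantics are assumptions for illustration, not the actual test-utility code:

    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.Distribution;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.Row;

    // Simplified monitor sketch: counts fields flowing past and records a
    // processing-time mark per element; the harness derives read_time from such marks.
    class RowMonitorSketchFn extends DoFn<Row, Row> {
      private final Counter fieldsRead = Metrics.counter("sql_bqio", "fields_read");
      private final Distribution timeMarks = Metrics.distribution("sql_bqio", "read_time");

      @ProcessElement
      public void processElement(@Element Row row, OutputReceiver<Row> out) {
        fieldsRead.inc(row.getFieldCount());
        timeMarks.update(System.currentTimeMillis());
        out.output(row);
      }
    }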

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 10, 2020 12:47:49 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 40.257 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
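
To reproduce locally, the failing task named above can be rerun with the suggested flag. This is only the generic Gradle form of the hint; the integration test additionally needs GCP credentials and pipeline options that are not shown here:

    ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --stacktrace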

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 32s
137 actionable tasks: 96 executed, 41 from cache

Publishing build scan...
https://gradle.com/s/h6tviid6tp2ge

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1221

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1221/display/redirect?page=changes>

Changes:

[noreply] [BEAM-11188] Add wrappers to Go Xlang examples, and adjust front-end.


------------------------------------------
[...truncated 342.68 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 10, 2020 6:50:53 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 10, 2020 6:50:53 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 10, 2020 6:50:53 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 10, 2020 6:50:53 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 10, 2020 6:50:53 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 10, 2020 6:50:53 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 10, 2020 6:50:53 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@480540517]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:280)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:153)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:546)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
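
The IllegalStateException above is Beam's standard coder-inference failure for Row elements: a PCollection<Row> produced by a plain ParDo has no default coder until a schema is attached. Below is a minimal, self-contained sketch of the API the message points at (PCollection.setRowSchema); the four-field schema is a stand-in for the real HACKER_NEWS row type, not the test's actual schema:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Hypothetical four-field schema standing in for the projected HACKER_NEWS columns.
        final Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

        PCollection<Row> rows = p
            .apply(Create.of("story"))
            // A ParDo emitting Row, like ParDo(RowMonitor): Beam cannot infer a coder for Row.
            .apply(ParDo.of(new DoFn<String, Row>() {
              @ProcessElement
              public void processElement(@Element String type, OutputReceiver<Row> out) {
                out.output(Row.withSchema(schema)
                    .addValues("someone", type, "a title", 3L)
                    .build());
              }
            }))
            // The fix the message asks for: attach the schema so RowCoder is used downstream.
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }

Whether the real fix belongs in the test or in how BeamSqlRelUtils attaches the output schema of BeamIOSourceRel is a separate question; the sketch only illustrates the call named in the error.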

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 10, 2020 6:50:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 10, 2020 6:50:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 10, 2020 6:50:54 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 10, 2020 6:50:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 10, 2020 6:50:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 10, 2020 6:50:54 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 10, 2020 6:50:55 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 10, 2020 6:50:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
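
The plan and the log line above show both the column projection (usedFields) and the full WHERE clause being handed to the BigQuery storage read API. For reference, here is a hedged sketch of the Beam SQL DDL that selects this path; the table name, column list, and LOCATION are illustrative, and only the TBLPROPERTIES method setting is what requests DIRECT_READ:

    CREATE EXTERNAL TABLE HACKER_NEWS (
      title VARCHAR,
      score BIGINT,
      `by` VARCHAR,
      type VARCHAR
    )
    TYPE bigquery
    LOCATION 'apache-beam-testing:some_dataset.hacker_news'
    TBLPROPERTIES '{"method": "DIRECT_READ"}';

    -- With DIRECT_READ the planner produces BeamPushDownIOSourceRel and the
    -- filter below is evaluated by BigQuery instead of in a Beam Calc stage.
    SELECT `by` AS author, type, title, score
    FROM HACKER_NEWS
    WHERE (type = 'story' OR type = 'job') AND score > 2;
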
    Nov 10, 2020 6:50:58 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 10, 2020 6:51:01 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 10, 2020 6:51:01 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-t77tdX7HknW3PYupI_cycQia3KQXH926gd0C6XX3u1Y.jar
    Nov 10, 2020 6:51:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests-Kri6AP6PVIGBg5x8NDVsNs1UpyK0uK9y5roU3rqVb0A.jar
    Nov 10, 2020 6:51:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-rXfQ9j1pZPEnYvhWI2y71wt4McD46Q0QGzqzADP1sFc.jar
    Nov 10, 2020 6:51:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests-XqwGw9Q-rinttFDzjio1z6HZuaBgPhLFNWLibECVjs8.jar
    Nov 10, 2020 6:51:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-OQ59xom716863NiymyuqSqFbdD5it5xFuikNIN-3h9I.jar
    Nov 10, 2020 6:51:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8227018754274414455.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-kuDsP7rbtfD8bjx1JYE7e1ZclDfUagRWJzEHXZwFADE.jar
    Nov 10, 2020 6:51:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-t77tdX7HknW3PYupI_cycQia3KQXH926gd0C6XX3u1Y.jar
    Nov 10, 2020 6:51:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-FVboPdPN2gc5o_WUkd2diZRAJ3nAmNR6z1hyy5Okuec.jar
    Nov 10, 2020 6:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests-BWXwLFnQSfgBMptfVM-VLD-f1yCdD9TAcpEC6bKPETA.jar
    Nov 10, 2020 6:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-VyOSAV_OcD_XECxkSON5GQ0RDJYJdgHCGlS4aVxMOF0.jar
    Nov 10, 2020 6:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests-U0f1KfqRkfu2Ie64UXT1DlUTqx1WZi3-wKdnZ04Hlw4.jar
    Nov 10, 2020 6:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT-eiQtjUbriR_7d_9larfBQeSwzs5LXLoCSrgH4V4Tu9A.jar
    Nov 10, 2020 6:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests-o77_gFjfOOetxRz5Maa4-8dVJVF7EKIzVoQLkTn_uCQ.jar
    Nov 10, 2020 6:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests-vEojiemJFCebw_senW78Zcq2Lyx4RFi_dxHqCKO9CrU.jar
    Nov 10, 2020 6:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded-f3A1Ghoz03pMXNnd__r_12H6fj_1rrZSdESpTSzzDU0.jar
    Nov 10, 2020 6:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.27.0-SNAPSHOT-_jfmcnr3ozcuUsJLY4QA1wOE72RtybE9nEoq9_dpifA.jar
    Nov 10, 2020 6:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.27.0-SNAPSHOT-sPgH6S5WJYIy1MMPecAQ7iqyZhjb0VOdmimqW5MMCPQ.jar
    Nov 10, 2020 6:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT-b2R4If09O6ewqxOrORirmWqwwa4PzRbWwe43fFqL4oA.jar
    Nov 10, 2020 6:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-pt9JjZtaQpM62K09NW3TuVSrDYuqrdy0UBcN5zxJ5Ew.jar
    Nov 10, 2020 6:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.27.0-SNAPSHOT-Er3TGHxNiv3BZKT91SxQCEvaQ1tMbQf7DmXUym97pSA.jar
    Nov 10, 2020 6:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT-I4K_iw4E3CbPfchfdQGGqAJS2ozAGhmwoaQqsFLeXfw.jar
    Nov 10, 2020 6:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-oC9B0wlOpuAzC1_Y34VjawQIfp4pGLYC9xPClZInI0k.jar
    Nov 10, 2020 6:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-Of9ewJVZnPANcKa3J3L96KYT_mdr2MQGb1MbLXEHkWc.jar
    Nov 10, 2020 6:51:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-tests-OGlTpepzR1mRPvAeoqzHCk0E04d8kG9F9yQjpyH2K7s.jar
    Nov 10, 2020 6:51:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.27.0-SNAPSHOT-_ypN5kJOt8MmwVDWvwQ0PrTi5gmKZkGJjHIj8KMqqfc.jar
    Nov 10, 2020 6:51:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.27.0-SNAPSHOT-1brtLMlzH3x3AS-md9HjWiM8jlaD48vPro61CwbwB-4.jar
    Nov 10, 2020 6:51:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.27.0-SNAPSHOT-kK26OQM04ErVrBK5Nin13SwnEWlUt_35aAIDKAkNMUs.jar
    Nov 10, 2020 6:51:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-oiCDd9Rw_vliOfs15egbudL-yUCEo-8pjItBt25_QOs.jar
    Nov 10, 2020 6:51:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-Q2q4i-PE1g8QOmNlA5-R1Xb23Rs1WF-qQB8CyXewipI.jar
    Nov 10, 2020 6:51:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests-YY73z5O4akLAPTRggIxEDEw1VAn2UVs8GovZZFIO2L0.jar
    Nov 10, 2020 6:51:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-XnU3IigqiRNGoxGdpN4x23AaneVRgbHf8U8jssQFxaA.jar
    Nov 10, 2020 6:51:06 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 196 files cached, 30 files newly uploaded in 4 seconds
    Nov 10, 2020 6:51:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 10, 2020 6:51:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 10, 2020 6:51:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 10, 2020 6:51:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 10, 2020 6:51:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 10, 2020 6:51:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98996 bytes, hash 5b528a2571b99f6d3563ee6877d0ed92f874f06c0ae9b441f13012661fe74400> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-W1KKJXG5n201Y-5od9Dtkvh08GwK6bRB8TASZh_nRAA.pb
    Nov 10, 2020 6:51:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.27.0-SNAPSHOT
    Nov 10, 2020 6:51:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-09_22_51_07-2282846260901693024?project=apache-beam-testing
    Nov 10, 2020 6:51:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-09_22_51_07-2282846260901693024
    Nov 10, 2020 6:51:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-09_22_51_07-2282846260901693024
    Nov 10, 2020 6:51:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-10T06:51:07.732Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 10, 2020 6:51:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T06:51:15.409Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 10, 2020 6:51:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T06:51:16.221Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 10, 2020 6:51:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T06:51:16.258Z: Expanding GroupByKey operations into optimizable parts.
    Nov 10, 2020 6:51:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T06:51:16.287Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 10, 2020 6:51:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T06:51:16.353Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 10, 2020 6:51:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T06:51:16.398Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 10, 2020 6:51:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T06:51:16.430Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 10, 2020 6:51:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T06:51:16.470Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 10, 2020 6:51:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T06:51:17.158Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 10, 2020 6:51:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T06:51:17.239Z: Starting 5 workers in us-central1-f...
    Nov 10, 2020 6:51:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T06:51:40.608Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 10, 2020 6:53:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T06:53:17.527Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Nov 10, 2020 6:53:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T06:53:17.563Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Nov 10, 2020 6:53:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T06:53:29.053Z: Workers have started successfully.
    Nov 10, 2020 6:53:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T06:53:29.115Z: Workers have started successfully.
    Nov 10, 2020 6:53:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T06:53:38.106Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 10, 2020 6:54:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T06:54:06.022Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 10, 2020 6:54:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T06:54:06.216Z: Cleaning up.
    Nov 10, 2020 6:54:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T06:54:06.304Z: Stopping worker pool...
    Nov 10, 2020 6:55:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T06:55:16.734Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 10, 2020 6:55:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T06:55:16.782Z: Worker pool stopped.
    Nov 10, 2020 6:55:23 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-09_22_51_07-2282846260901693024 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 72ffe483-53a6-4237-b87d-719eff9674d2 and timestamp: 2020-11-10T06:55:23.832000000Z:
                     Metric:                    Value:
                   read_time                     16.76
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 10, 2020 6:55:24 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
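
As in the earlier run, the results stayed on the console because the InfluxDB publisher had no target configured. A hypothetical invocation supplying those settings is sketched below; the option names and values are assumptions read off the warning, not confirmed parameters of this Jenkins job:

    ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest \
      -DintegrationTestPipelineOptions='["--project=apache-beam-testing",
        "--influxMeasurement=sql_bqio_read_java_batch",
        "--influxDatabase=beam_test_metrics",
        "--influxHost=http://localhost:8086"]'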

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 4 mins 50.905 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 46s
137 actionable tasks: 98 executed, 39 from cache

Publishing build scan...
https://gradle.com/s/sopryvg723koy

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1220

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1220/display/redirect?page=changes>

Changes:

[samuelw] Add a test for windowed side inputs that do not have a value before the

[noreply] [BEAM-9547] Add basic support for `DataFrame.{eval,query}` (#13264)


------------------------------------------
[...truncated 340.61 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 10, 2020 12:46:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 10, 2020 12:46:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 10, 2020 12:46:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 10, 2020 12:46:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 10, 2020 12:46:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 10, 2020 12:46:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 10, 2020 12:46:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1452961901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:280)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:153)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:546)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 10, 2020 12:46:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 10, 2020 12:46:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 10, 2020 12:47:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 10, 2020 12:47:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 10, 2020 12:47:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 10, 2020 12:47:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 10, 2020 12:47:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 10, 2020 12:47:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Nov 10, 2020 12:47:01 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 10, 2020 12:47:03 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 10, 2020 12:47:03 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-pETbzowQDaRh-kuOZPMUiA6Zfo4jd1Jfv6MRMqsF1Y0.jar
    Nov 10, 2020 12:47:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-05ipMn-mXngh8wdrWFFqwe_vMYfxucR_8vVhPq6-qog.jar
    Nov 10, 2020 12:47:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests-r14LH01SRxmLDaVn669JgOenksGr6dEVvsx9N-aJOfY.jar
    Nov 10, 2020 12:47:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests-_bAG-wEj1Mx4Yo7b2OEkVn3VCkZz9xsFVhhGowMOMr8.jar
    Nov 10, 2020 12:47:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-d5_J0N78rCLVmZYKEt-dGtLC2Eu0Cyr_1EX6qOmw8Gc.jar
    Nov 10, 2020 12:47:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests-0CHMtohdnqFszivkxThznozObBxbyNVzuFtq_XWThUw.jar
    Nov 10, 2020 12:47:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-7psQUDg21ONcsyIZE2lR36_HQU6lKnd8TX1ErjpvaXI.jar
    Nov 10, 2020 12:47:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-rLtI2zn5B5ayS1gpNuywKzu1K6e88GueahEpdwR7TiE.jar
    Nov 10, 2020 12:47:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-3MjrKaUBqnKEsXTIM_yA0RZCjoGWtMcCy--s_H400YI.jar
    Nov 10, 2020 12:47:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests-dwiVckE2DsgewJ7W4uGaDWEQSlKN9ybNZ3ATuZheLkg.jar
    Nov 10, 2020 12:47:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests-fhwfxFeqjM1UR3CbNth-E1brTGSqQ_8KijODfpgkfD8.jar
    Nov 10, 2020 12:47:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests-vNuDz6EhNzrDLTCc_SMsBrJ7k815woUx-ZYlZF3AZV4.jar
    Nov 10, 2020 12:47:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-rh-2XdnAh5u50U38VIFCTEB_Z6pqvn1__3wtNMJKpFA.jar
    Nov 10, 2020 12:47:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1451872920116577282.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-0_ef3waRg0IxjVj2F2PNrQbYmjATwVKNJX4jZNHFaRI.jar
    Nov 10, 2020 12:47:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-vpbgfcR8PwMCLBXx3c5NBs1jWzsSbBVsetif5JfUZGU.jar
    Nov 10, 2020 12:47:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT-5io8-BlyMHGjjoIp1AI2mlAzAEybDFGONvXSDfN8w9o.jar
    Nov 10, 2020 12:47:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests-pLEsFiUp9p0Qgj89Sj6_sJm2Yg48jn8KlY4Kr80IqYk.jar
    Nov 10, 2020 12:47:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-kkoeEDdDZ23h0X_1iRUxJxvr8KTcS32hp2nTwbNI4PY.jar
    Nov 10, 2020 12:47:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT-GxMj03G0VBgFUeTApXaYiNT-9ABNA0iEy6y_c4VmjeQ.jar
    Nov 10, 2020 12:47:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.27.0-SNAPSHOT-AbfYGFAgXIuPDolLoq421UgCXVwx7Q6wZmD_y5Ptql8.jar
    Nov 10, 2020 12:47:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.27.0-SNAPSHOT-ezqxkeLCR33sTk6jPboYJLOQFjzjB5lk5We4mPfYOAM.jar
    Nov 10, 2020 12:47:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-YNfNfKLpgF9GYe7UlK9agRy_GXAbeg2FZicZ4ugxplI.jar
    Nov 10, 2020 12:47:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.27.0-SNAPSHOT-8LJWu67Bj-vfRmLBaAmBt7fFm9Charbtrl4cC0nlY_A.jar
    Nov 10, 2020 12:47:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-e0vUebyGjkw4X8QLCrIZPf7o8Zlnl9AWjl-4PW67uZ0.jar
    Nov 10, 2020 12:47:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-tests-iPG87ZmP_pfL76orCyqbksHZVcsGkIDVBFEa217ehdI.jar
    Nov 10, 2020 12:47:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded--DjrLaekHVR0OlR_CRzOkRG5Sd9_5cqmwmI19a6Flvg.jar
    Nov 10, 2020 12:47:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT-5Z3gz3mv7rLa_CM6Byc4cTaYUeGWS68xJfvlvDNgYJg.jar
    Nov 10, 2020 12:47:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-pETbzowQDaRh-kuOZPMUiA6Zfo4jd1Jfv6MRMqsF1Y0.jar
    Nov 10, 2020 12:47:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.27.0-SNAPSHOT-v7POcAvMSpuerCPud_h9oO6p_0nJT_0u8XssEpdSRjI.jar
    Nov 10, 2020 12:47:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.27.0-SNAPSHOT-lzES1YY-vlAa0iaq1Mv8VgGqGZA2RkK_4KKhBGaN25k.jar
    Nov 10, 2020 12:47:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.27.0-SNAPSHOT-VM4j2Vc6Swc8UmmhPnbhPqc2EsuZ5za3Z0H3D2a1EzU.jar
    Nov 10, 2020 12:47:05 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 196 files cached, 30 files newly uploaded in 1 seconds
    Nov 10, 2020 12:47:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 10, 2020 12:47:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 10, 2020 12:47:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 10, 2020 12:47:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 10, 2020 12:47:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 10, 2020 12:47:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98996 bytes, hash 5d5e99ac6981f0a7396ee4be15a6bd834217bd56c4c1d42d785b8fcbf28d1fcc> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-XV6ZrGmB8Kc5buS-Faa9g0IXvVbEwdQteFuPy_KNH8w.pb
    Nov 10, 2020 12:47:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.27.0-SNAPSHOT
    Nov 10, 2020 12:47:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-09_16_47_05-8617383239945488962?project=apache-beam-testing
    Nov 10, 2020 12:47:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-09_16_47_05-8617383239945488962
    Nov 10, 2020 12:47:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-09_16_47_05-8617383239945488962
    Nov 10, 2020 12:47:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-10T00:47:05.651Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 10, 2020 12:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T00:47:15.122Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 10, 2020 12:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T00:47:15.957Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 10, 2020 12:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T00:47:16.012Z: Expanding GroupByKey operations into optimizable parts.
    Nov 10, 2020 12:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T00:47:16.073Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 10, 2020 12:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T00:47:16.151Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 10, 2020 12:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T00:47:16.183Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 10, 2020 12:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T00:47:16.215Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 10, 2020 12:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T00:47:16.253Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 10, 2020 12:47:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T00:47:16.783Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 10, 2020 12:47:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T00:47:16.871Z: Starting 5 workers in us-central1-f...
    Nov 10, 2020 12:47:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T00:47:44.350Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Nov 10, 2020 12:47:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T00:47:44.390Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Nov 10, 2020 12:47:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T00:47:44.473Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 10, 2020 12:47:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T00:47:54.652Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 10, 2020 12:48:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T00:48:03.063Z: Workers have started successfully.
    Nov 10, 2020 12:48:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T00:48:03.119Z: Workers have started successfully.
    Nov 10, 2020 12:48:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T00:48:39.203Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 10, 2020 12:48:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T00:48:39.366Z: Cleaning up.
    Nov 10, 2020 12:48:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T00:48:39.470Z: Stopping worker pool...
    Nov 10, 2020 12:49:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T00:49:28.829Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 10, 2020 12:49:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-10T00:49:28.875Z: Worker pool stopped.
    Nov 10, 2020 12:49:35 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-09_16_47_05-8617383239945488962 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 2eec606c-ca28-40fb-8a29-4efee33c1e53 and timestamp: 2020-11-10T00:49:35.201000000Z:
                     Metric:                    Value:
                   read_time                    14.298
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 10, 2020 12:49:35 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 43.885 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 3s
137 actionable tasks: 100 executed, 37 from cache

Publishing build scan...
https://gradle.com/s/gsnio233ska2w

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1219

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1219/display/redirect?page=changes>

Changes:

[piotr.szuberski] [BEAM-8719 BEAM-8768 BEAM-8769 BEAM-8770 BEAM-8771] Update minor hadoop

[noreply] [BEAM-10892] Remove redundant asterisk from the Kafka external table doc


------------------------------------------
[...truncated 348.83 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 09, 2020 6:47:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 09, 2020 6:47:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 09, 2020 6:47:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 09, 2020 6:47:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 09, 2020 6:47:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 09, 2020 6:47:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 09, 2020 6:47:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1452961901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:280)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:153)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:546)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
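
For reference, the remedy this exception text points at is to attach a Schema to the Row-typed output so Beam can derive a RowCoder for it. Below is a minimal, self-contained sketch of that pattern; the schema fields, the pass-through DoFn, and the whole pipeline are hypothetical stand-ins for illustration, not the integration test's actual code:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        // Hypothetical schema mirroring the four columns the test query projects.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        PCollection<Row> rows =
            p.apply(Create.of(
                    Row.withSchema(schema).addValues("alice", "story", "Hello", 3L).build()))
                // Create cannot infer a coder for Row, so attach the schema here...
                .setRowSchema(schema)
                .apply("PassThrough", ParDo.of(
                    new DoFn<Row, Row>() {
                      @ProcessElement
                      public void process(@Element Row row, OutputReceiver<Row> out) {
                        out.output(row);
                      }
                    }))
                // ...and again on the ParDo output, which is the piece the failing
                // ParDo(RowMonitor).output above is missing. Setting a coder explicitly
                // via setCoder(RowCoder.of(schema)) is the other option the message names.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }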

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 09, 2020 6:47:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 09, 2020 6:47:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 09, 2020 6:47:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 09, 2020 6:47:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 09, 2020 6:47:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 09, 2020 6:47:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 09, 2020 6:47:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 09, 2020 6:47:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
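
In plain BigQueryIO terms, the pushed-down read logged above corresponds roughly to a Storage Read API source that requests only the used fields and ships the supported predicate as a row restriction. The sketch below is a hypothetical illustration of that shape only (the table reference is a placeholder, not the table this test actually reads):

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class PushDownShapeSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        PCollection<TableRow> rows =
            p.apply(
                "Read Input BQ Rows with push-down",
                BigQueryIO.readTableRows()
                    // Placeholder table reference; the IT configures its table elsewhere.
                    .from("some-project:some_dataset.HACKER_NEWS")
                    .withMethod(TypedRead.Method.DIRECT_READ)
                    // Only the columns the query uses are requested from the Storage API...
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // ...and the supported predicate is evaluated server-side.
                    .withRowRestriction(
                        "(`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        p.run().waitUntilFinish();
      }
    }

When the read goes through Beam SQL instead, the BigQueryFilter shown in the BEAMPlan above is what splits the predicate into supported conjuncts (pushed to the source) and unsupported ones (kept in the BeamCalcRel).
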
    Nov 09, 2020 6:47:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 09, 2020 6:47:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 09, 2020 6:47:20 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-Q4TdUyn1HRhenQSyQcT7G_xnSC_QvNk3PUzHGFZoC1c.jar
    Nov 09, 2020 6:47:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-6vAe-gt64zozvBvpyaMdk9soKY4ZtM9hkFX4Z09areg.jar
    Nov 09, 2020 6:47:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2878132730206877224.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-l2yopR4lOsWvU5hdWD09uE54p-1_oRuqIu7QIVjpvAQ.jar
    Nov 09, 2020 6:47:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-X84j9gX4XMxVjb2Gb-9hu-1Eirha4VABnJPnY4ydHIs.jar
    Nov 09, 2020 6:47:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests-hcsgU1UaVV4tJrGsxj_yB7I0TVk0z31zQc9YTzAiUaE.jar
    Nov 09, 2020 6:47:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests-OjzZzZeTSFI2d3JQH4iQuEoyMUF5uDLuGBkN10xBA8E.jar
    Nov 09, 2020 6:47:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests-dZLXeYAhVO2CNqFTXBoNptllB2o7p4lir2Aw4PFp-sc.jar
    Nov 09, 2020 6:47:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests-8v7m5MfDnF1XAyy8uTJminckAvp5DQdZiYen8BUYV7Y.jar
    Nov 09, 2020 6:47:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests-CqOelswY1AJjpeuVZWUUnM8V87w4TPdE-L_GjcIZdjA.jar
    Nov 09, 2020 6:47:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests-8Pe6Tyr324M4GKxBSyen_CbwYLlQorlkdPY7w1ku05E.jar
    Nov 09, 2020 6:47:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-cOqGmqMTAWdGBSmYZA5GpEGoicaJevg7UkG0IFJYmsU.jar
    Nov 09, 2020 6:47:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-0OpeI4XEFlrLjSz69cE4JaFQk9S-f3TmV7bMf2gTL2E.jar
    Nov 09, 2020 6:47:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-lDVJkJ9Ekf7SjrGmn29GqXfflWyFl8xiv0t5kIMqIxM.jar
    Nov 09, 2020 6:47:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-RlzzdfjyLkzst0wRx-96QLPgNzGRwRR0zrYnNwFnPsg.jar
    Nov 09, 2020 6:47:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.27.0-SNAPSHOT-m72gmDamp_fiOMUhcmXTeFUBeP_lPQF2F3CypsHUe8Q.jar
    Nov 09, 2020 6:47:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT-5tooM4ZLp3woWe12z468VyHrPYo9NwBW9fUPszU_4K8.jar
    Nov 09, 2020 6:47:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT-4zJkZRZrCCN81gc61592aL0ODhzUs8HpX6GuRdMPDlk.jar
    Nov 09, 2020 6:47:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.27.0-SNAPSHOT-rDTUEXkL61KwrjmGdnzB5cFC2jEvaHzQizDMCBqMdZ8.jar
    Nov 09, 2020 6:47:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.27.0-SNAPSHOT-hG85Prkirk5ZArEctqK7JGrdSfGyDcVNmJM9GfCcdF8.jar
    Nov 09, 2020 6:47:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-PNrIpGnX5ex3g2HVDaEXZXp5raSMRvBIegBpqABl3i8.jar
    Nov 09, 2020 6:47:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-OcEeupIspV4mz6ev0nRY2i_DyfPdWq2KSTVqEpJXK0w.jar
    Nov 09, 2020 6:47:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests-LjY3Hn7EMQXezXtDTXK9HPDU_DN1A9SfhmQBFe9KAa0.jar
    Nov 09, 2020 6:47:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-GVs0osLnKYdrHor1BcmcXyuaDTBDR4vUIEVplbBeT6c.jar
    Nov 09, 2020 6:47:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT-E1CrxjIgV5IRAdLpNkuh3X1YdY4Ltpb6s8yIvDsQ_tQ.jar
    Nov 09, 2020 6:47:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-tests-0pWoIuaGLo_fYlB1pdrYpAisufKCHdubLXpAvKws98A.jar
    Nov 09, 2020 6:47:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded-9rq_s3vUgMXRW1oKr0tvVpxqOF9ba4ghiF3xvQrl4ZU.jar
    Nov 09, 2020 6:47:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-Q4TdUyn1HRhenQSyQcT7G_xnSC_QvNk3PUzHGFZoC1c.jar
    Nov 09, 2020 6:47:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-xOasSwnFR1KOIZjRSgtb_oXyIEG8nJuCLrSPYpqG_ng.jar
    Nov 09, 2020 6:47:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.27.0-SNAPSHOT-KxpckzJlOBa4AqHP11mFY5RA09a37UhhIKNHmCee4ao.jar
    Nov 09, 2020 6:47:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.27.0-SNAPSHOT-lMDb--eZf4UMVYF1pKsIPQ4rBujVIVVmgSoT6EHpfCk.jar
    Nov 09, 2020 6:47:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.27.0-SNAPSHOT-sSUOg8tnRYV1CQaIBg8yvgdNZDAJkPD7L-wAIBPTLG4.jar
    Nov 09, 2020 6:47:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 196 files cached, 30 files newly uploaded in 0 seconds
    Nov 09, 2020 6:47:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 09, 2020 6:47:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 09, 2020 6:47:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 09, 2020 6:47:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 09, 2020 6:47:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 09, 2020 6:47:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98996 bytes, hash 723d006fa8e82bc5c7dfda9347cd17fd30313d093c5945ad6d3bbca40ef3775f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-cj0Ab6joK8XH39qTR80X_TAxPQk8WUWtbTu8pA7zd18.pb
    Nov 09, 2020 6:47:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.27.0-SNAPSHOT
    Nov 09, 2020 6:47:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-09_10_47_22-1081006294565921934?project=apache-beam-testing
    Nov 09, 2020 6:47:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-09_10_47_22-1081006294565921934
    Nov 09, 2020 6:47:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-09_10_47_22-1081006294565921934
    Nov 09, 2020 6:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-09T18:47:22.270Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 09, 2020 6:47:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T18:47:30.770Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 09, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T18:47:31.490Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 09, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T18:47:31.545Z: Expanding GroupByKey operations into optimizable parts.
    Nov 09, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T18:47:31.573Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 09, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T18:47:31.656Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 09, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T18:47:31.692Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 09, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T18:47:31.719Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 09, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T18:47:31.759Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 09, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T18:47:32.172Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 09, 2020 6:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T18:47:32.256Z: Starting 5 workers in us-central1-f...
    Nov 09, 2020 6:47:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T18:47:56.776Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 09, 2020 6:48:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T18:48:20.805Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Nov 09, 2020 6:48:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T18:48:20.841Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Nov 09, 2020 6:48:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T18:48:31.147Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 09, 2020 6:48:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T18:48:37.953Z: Workers have started successfully.
    Nov 09, 2020 6:48:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T18:48:37.976Z: Workers have started successfully.
    Nov 09, 2020 6:49:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T18:49:10.221Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 09, 2020 6:49:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T18:49:10.504Z: Cleaning up.
    Nov 09, 2020 6:49:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T18:49:10.650Z: Stopping worker pool...
    Nov 09, 2020 6:50:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T18:50:12.615Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 09, 2020 6:50:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T18:50:12.663Z: Worker pool stopped.
    Nov 09, 2020 6:50:20 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-09_10_47_22-1081006294565921934 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 8f895a82-7d65-4a8b-a465-637fb363b87e and timestamp: 2020-11-09T18:50:20.691000000Z:
                     Metric:                    Value:
                   read_time                    11.454
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 09, 2020 6:50:21 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 3 mins 13.918 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 4s
137 actionable tasks: 103 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/y3tjqymgfkr2c

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1218

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1218/display/redirect?page=changes>

Changes:

[noreply] [BEAM-11075] Go SDK's synthetic source supports hot keys generation


------------------------------------------
[...truncated 335.63 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 09, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 09, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 09, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 09, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 09, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 09, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 09, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1452961901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:280)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:153)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:546)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 09, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 09, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 09, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 09, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 09, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 09, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 09, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 09, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Nov 09, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 09, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 09, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-sugueavAurpSvSXZfLQ-affN3MVwCdZtMxNbfN319ck.jar
    Nov 09, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-CF3EecjodPk63qZfDkFaeig7WZ43nQNNtP0k7mLnZk0.jar
    Nov 09, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests-5jUDLOJv7v709oXjcPye1h9piKsEtn07resRX-oiooI.jar
    Nov 09, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test944225967325411379.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-cNODGn1SOUzj776YoopnkUx3uvCI9ZQOiZqDohBnT0k.jar
    Nov 09, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-hk0XQ98t4GaOFTMlgkDnLE3zDdsbDQ9nyfYfX96nwds.jar
    Nov 09, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-II7WAAh_ulS0RpyK39LsgxeSd4SNdQiYQZf7RjSPzAw.jar
    Nov 09, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests-mDYig2owCdxy8jljqCmInzJKu3gJs5HkwyfygPH7a9s.jar
    Nov 09, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests-3JIXHIw295mK-aSwqvtizY92z9HdsXUyo0UCCq3DTk4.jar
    Nov 09, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-bQhFGmiuZc-xRfVTrK_o7oR2JKemQxUIFrmBY9vD3Yw.jar
    Nov 09, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-sJK60c0YqIHQ8titUriDybvS8f2w3sgw6if6TnBy_Kk.jar
    Nov 09, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-scOmQHu2GYNto6HgtbD6zXyPXAtjwgFVd2ytqfqNDsQ.jar
    Nov 09, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests-6GuVlBwtVlTKBdnkmDqJznFH1QHMdtMsco-uPM6aYDA.jar
    Nov 09, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT-Eb-sSQBarKFJjlhmkGb_UXWiMKyKes4rfoFEjBnv2eA.jar
    Nov 09, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests-2MfdkPDskwrZMjCL85lipFxMiSq_eU28YILHsnIyGRo.jar
    Nov 09, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests-Hw-lt9auzxA353Yl35zYRx511u1hPKDtzP88BW-zlQk.jar
    Nov 09, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-LuSKr3_1lG5z4CJiol9fQhc372N3-Cd4eq9JsW30D1U.jar
    Nov 09, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests-wOgjbohsC9oQ_vq9Q37ZyF3GlEKxYsxACMhlDCEOYoM.jar
    Nov 09, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-l5519k8NjgAIBDSzFoOJ4XXTgYAK6TfTiMUmqP1vxhY.jar
    Nov 09, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT-AdxPHJZYurTFHNJzSnr7zaSH0oxK2ScWKny9QSpejQo.jar
    Nov 09, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-el4VNKjV7sIqYMv38kbOkT4VV3eYcAMbe_rqFMs1o2o.jar
    Nov 09, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.27.0-SNAPSHOT-tzmoG4HTKKT1496ey5nA0ayMdkKqZJpOCVxD86oqG-4.jar
    Nov 09, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.27.0-SNAPSHOT-iKP6zW4i7gXe6XCaP81F_WqKaH1fdKB_a11f7PfsmBo.jar
    Nov 09, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-sugueavAurpSvSXZfLQ-affN3MVwCdZtMxNbfN319ck.jar
    Nov 09, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.27.0-SNAPSHOT-ZoJQ2F6B3wsM5veNL8tEcjRZ0-IIh3aQSPaCj09cRA8.jar
    Nov 09, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-e7MsKU36FBDLmZvlaAAeXkXTl2ZwANMNxqWMAOLhYPU.jar
    Nov 09, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT-2nRdoAieCPcXr9jwTZha58nm69mGYGXTJv5zxG17B6Q.jar
    Nov 09, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded-5FwKVs40TvHWYerNRgm-6rwOL7F0gBMoMegCLpb4U-k.jar
    Nov 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-tests--wm8XemW4F6TsxXuLu57eIxMo_WjDNy-Rn_2FWf_Njo.jar
    Nov 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.27.0-SNAPSHOT-2bwuGmoKPn86MmlullrEBYEBLmGcuOJfUAId0-nyDFk.jar
    Nov 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.27.0-SNAPSHOT-OBEljTwU0WiF67lGtMPIjOSxyDJu9-Ol-it5HgIIWP8.jar
    Nov 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.27.0-SNAPSHOT-gCrdYg3EvEOf7Tvw4mcBV5pugzQIrlFtLcHfLXP2mR8.jar
    Nov 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 196 files cached, 30 files newly uploaded in 0 seconds
    Nov 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98995 bytes, hash 4f4b2b290a2bc55552b51aba102dfc82c0ac04dcf81484f9f9ffd2a453b8cc5d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-T0srKQorxVVStRq6EC38gsCsBNz4FIT5-f_SpFO4zF0.pb
    Nov 09, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.27.0-SNAPSHOT
    Nov 09, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-09_04_45_23-13734622344332834449?project=apache-beam-testing
    Nov 09, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-09_04_45_23-13734622344332834449
    Nov 09, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-09_04_45_23-13734622344332834449
    Nov 09, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-09T12:45:23.172Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 09, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T12:45:30.866Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 09, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T12:45:31.438Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 09, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T12:45:31.478Z: Expanding GroupByKey operations into optimizable parts.
    Nov 09, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T12:45:31.508Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 09, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T12:45:31.572Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 09, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T12:45:31.624Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 09, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T12:45:31.665Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 09, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T12:45:31.700Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 09, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T12:45:32.125Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 09, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T12:45:32.204Z: Starting 5 workers in us-central1-f...
    Nov 09, 2020 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T12:45:46.748Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 09, 2020 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T12:45:57.721Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Nov 09, 2020 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T12:45:57.762Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Nov 09, 2020 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T12:46:08.071Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 09, 2020 12:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T12:46:17.298Z: Workers have started successfully.
    Nov 09, 2020 12:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T12:46:17.327Z: Workers have started successfully.
    Nov 09, 2020 12:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T12:47:00.966Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 09, 2020 12:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T12:47:01.211Z: Cleaning up.
    Nov 09, 2020 12:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T12:47:01.279Z: Stopping worker pool...
    Nov 09, 2020 12:47:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T12:47:49.717Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 09, 2020 12:47:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T12:47:49.762Z: Worker pool stopped.
    Nov 09, 2020 12:47:55 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-09_04_45_23-13734622344332834449 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 300848b8-7906-4849-9b46-1decd6a3dfcd and timestamp: 2020-11-09T12:47:55.089000000Z:
                     Metric:                    Value:
                   read_time                    22.703
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 09, 2020 12:47:55 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 46.233 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 38s
137 actionable tasks: 96 executed, 41 from cache

Publishing build scan...
https://gradle.com/s/75u3mj6t25dhc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1217

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1217/display/redirect>

Changes:


------------------------------------------
[...truncated 336.63 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:546)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 09, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 09, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 09, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 09, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 09, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 09, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 09, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@209978778]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:280)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:153)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:546)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
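
    The exception text above points at the likely fix: the Row-producing output of
    ParDo(RowMonitor) carries no schema, so the SDK cannot infer a RowCoder for it.
    A minimal, hypothetical sketch of attaching a schema (field names mirror the
    SELECT list above; this is not the actual BigQueryIOPushDownIT code, where the
    schema would come from the HACKER_NEWS table definition):

        import org.apache.beam.sdk.schemas.Schema;
        import org.apache.beam.sdk.values.PCollection;
        import org.apache.beam.sdk.values.Row;

        // Hypothetical schema for the projected columns; BigQuery INTEGER maps to INT64.
        Schema rowSchema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        // rowMonitorOutput stands for the PCollection<Row> produced by ParDo(RowMonitor).
        // Attaching the schema lets the SDK infer a RowCoder, which addresses the first
        // root cause listed in the IllegalStateException. An equivalent alternative is
        // rowMonitorOutput.setCoder(RowCoder.of(rowSchema)).
        PCollection<Row> withSchema = rowMonitorOutput.setRowSchema(rowSchema);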

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 09, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 09, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 09, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 09, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 09, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 09, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 09, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 09, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Nov 09, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 09, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 09, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-BuOUVRyeXwr6t-ZMTUWpdy0DX8YNw_5ZYpaY_K4ahno.jar
    Nov 09, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-Kp80gDOSCk0T2Mo_eH8Kc2D9B1VYGjJBwhwgiovfpBE.jar
    Nov 09, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests-8FqQ9BgC9vSuyl4RsJSeSjz7ETFgPu7TxXzq8L64dis.jar
    Nov 09, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2018630144841955290.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-bz3pfp_XQpfY2PVmFTELmYoVW07BdzgMkz4BiFyPCyc.jar
    Nov 09, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests-dfKOohXRoXoanvBndPeQo4S_cM_ZLZ5EAdJWg-2yV5c.jar
    Nov 09, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests-atlFsn7hq-Y_F6NeIWondugzMbVKzbirZQ3WUDZWres.jar
    Nov 09, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests-g5egmCLWdAZ6SpucvBuBo5YKj5egp27jlkuk8gvyCsQ.jar
    Nov 09, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-g8Flvx9JahQAWzHp4NZna4YF-QkPYZWczQxRDJO5iqU.jar
    Nov 09, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests-1qPvdHGwX0uyHR-zyMUssjyXceSVFTmGXNaeq2P4_eo.jar
    Nov 09, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-f8pHkVcH1oOk5wd_5LCSPhjD7U7Vo4ZkmVpz28pmRcQ.jar
    Nov 09, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-f76CIxehH_0IjtDU6BPtWaU14cjQL7iPvbylw9S_aQA.jar
    Nov 09, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT-ALqmAOe_0qQPKZcRTWZ2dlJdr8S25-6dmaDlkzd9EaM.jar
    Nov 09, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-vu-ip1IzErkEE1bKAkhzfwRUJ0V2FVYuUNNC1BqBV4k.jar
    Nov 09, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests-1vPBA7l-72PKeNtNKQPPLK9QLt-uct4PEXxPdAO5DuY.jar
    Nov 09, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-lZQHx2ww3_SPRlAiqO4XkCvFTOZk4haV7-FW8kyN6HM.jar
    Nov 09, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.27.0-SNAPSHOT-Ry4uJzopUBvI3aINcUwn6F_TXtj4dHCG7MPggRSt1js.jar
    Nov 09, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT-Jrl2ib_qrZsY73WcMDsf7MsXVDRh1va8XcTvb6m11Ik.jar
    Nov 09, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-077qxCQeXSn7nKCkPdT9FMAhckoeStwO04_-PgtGHQ0.jar
    Nov 09, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-tests-eD4qtLxH202XBwNHmLzX1R98QXrRIrTGePnm9QLemHo.jar
    Nov 09, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests-TmrK89ovLKB2XKqGejkU_LiBlIaekE5Yau85weYgMFs.jar
    Nov 09, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-VQV36KS-_Eg2HNuhVrn_PBwF2vt2fHrZkvbjDORatOY.jar
    Nov 09, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-BuOUVRyeXwr6t-ZMTUWpdy0DX8YNw_5ZYpaY_K4ahno.jar
    Nov 09, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT-cJ3bD4zUfeWZmHeP9AKutNn5dsPmLYkmiRHQuqMeS-w.jar
    Nov 09, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-Jh1pExNP_dnVKzFwDSge_Hzg5Y6cEtiAuvyeqYjtz-8.jar
    Nov 09, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-ZulWsfbKFnNlpT4x6dKQIUVMYxUaSkcG2ZlcJeXWmeA.jar
    Nov 09, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.27.0-SNAPSHOT-NfHxNVjArxRapAc-sFluaubTGv-ajVI437aQ-1oaEGw.jar
    Nov 09, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.27.0-SNAPSHOT-jClNGeJdFbpDCOSvtgc9aA6jdLuC_4S83JQLvrZBIsA.jar
    Nov 09, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded-yyAC4CfB6lkxb0v_bBa5LPjNJLVx11r5edZI6HdaQuI.jar
    Nov 09, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.27.0-SNAPSHOT-pd95TqWSV-HtAVzYc_ok-1apbvmbHxJob73NNbw3gy4.jar
    Nov 09, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.27.0-SNAPSHOT-5ut528KAfGyyU1UUrTlGjrfW2kv0Dtl1gAurDzqRS9A.jar
    Nov 09, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.27.0-SNAPSHOT-jY6GNygcvs6wlABwLSVFCY0yJ--gSRZ_52KMjNOyMZw.jar
    Nov 09, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 196 files cached, 30 files newly uploaded in 0 seconds
    Nov 09, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 09, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 09, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 09, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 09, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 09, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98992 bytes, hash 6b84ca18924dff186829d8b1e37b2ad846a78f49e1ac1f636297dcc9b36d59cc> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-a4TKGJJN_xhoKdix43sq2Eanj0nhrB9jYpfcybNtWcw.pb
    Nov 09, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.27.0-SNAPSHOT
    Nov 09, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-08_22_45_20-3248158691474379265?project=apache-beam-testing
    Nov 09, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-08_22_45_20-3248158691474379265
    Nov 09, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-08_22_45_20-3248158691474379265
    Nov 09, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-09T06:45:20.262Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 09, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T06:45:28.475Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 09, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T06:45:28.994Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 09, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T06:45:29.033Z: Expanding GroupByKey operations into optimizable parts.
    Nov 09, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T06:45:29.061Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 09, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T06:45:29.136Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 09, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T06:45:29.166Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 09, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T06:45:29.198Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 09, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T06:45:29.232Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 09, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T06:45:29.784Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 09, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T06:45:29.858Z: Starting 5 workers in us-central1-f...
    Nov 09, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T06:45:55.616Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 09, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T06:45:55.730Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 09, 2020 6:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T06:46:16.879Z: Workers have started successfully.
    Nov 09, 2020 6:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T06:46:16.899Z: Workers have started successfully.
    Nov 09, 2020 6:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T06:46:53.469Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 09, 2020 6:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T06:46:53.632Z: Cleaning up.
    Nov 09, 2020 6:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T06:46:53.710Z: Stopping worker pool...
    Nov 09, 2020 6:47:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T06:47:38.129Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 09, 2020 6:47:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T06:47:38.179Z: Worker pool stopped.
    Nov 09, 2020 6:47:44 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-08_22_45_20-3248158691474379265 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b8a1609b-891a-4d19-abbd-6924a779da28 and timestamp: 2020-11-09T06:47:44.551000000Z:
                     Metric:                    Value:
                   read_time                    16.593
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 09, 2020 6:47:44 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 2 mins 37.948 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 29s
137 actionable tasks: 96 executed, 41 from cache

Publishing build scan...
https://gradle.com/s/ine4dhtebuoju

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1216

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1216/display/redirect>

Changes:


------------------------------------------
[...truncated 336.27 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:546)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 09, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 09, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 09, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 09, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 09, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 09, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 09, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1452961901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:280)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:153)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:546)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 09, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 09, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 09, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 09, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 09, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 09, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 09, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 09, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Nov 09, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-MpBxvdgUFVql_ovNAvmsPBoYSzplQFHh1Vyyby4-w1M.jar
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-LkDFA5F2LGP-yeBorI6fmAEuBcM_H07r3K-7F26C1nA.jar
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests-xswUq2TyF59eXNnt87RXzyhYGBgwlX_riYKSUZBR92s.jar
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT--A5AOa-k_WT2A4mxyAMxXORz1Ogeb_ekVO1ySAqb0Kc.jar
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2894458968652908456.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Im6txMFhAqfpRTFN-It1Xb2tWRdvWkYMs9zZBdzHT1M.jar
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT-esOwoMc3lpzkNWIVrIjDm2rSLcSSE7QPUuq1SwWy2YI.jar
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-SQhKJbLxEdSz7S732KRtB0dKnjaHFMayQjgiRAXrUbQ.jar
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests-t9JoEvn4s2T2ks7Yy6VY253woqm7wqXyUC6mKnhtFmQ.jar
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-BuDdybPnRomf-NF_cv5MyhkrjIxKLiaopywn3mbTptU.jar
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests-o2hBQ0EeCOUAtRRINotTzqYNLvJLT6w6ySgkA8y-xQY.jar
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-xDLVa4v8HjxmU8wqD0ZGix_nwk9AFTHzCMIaGCqfqNk.jar
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests-qhxeSKronFbWMH77Jhzfvky-nPIdkAEZNsbgbUKlFxM.jar
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests-QfvQaO8CNPhkhrFUh2ByLcPJaUvPf-2otYpYrEuVr4s.jar
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-YSNV3wuDZckGQuZPpfoyGdQ63zjJY6jkkgbXGsqmv7M.jar
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests-VqsJjtNJ66NMRWuH-5wp2GXiXUSBID8AvS7uhyXOAtg.jar
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT--52vDyZPdsBJCGvkcRsIQNHhAoguJKDIOi07CK_P2FE.jar
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-LkDFA5F2LGP-yeBorI6fmAEuBcM_H07r3K-7F26C1nA.jar
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-jxeSk-czqHV0AUaROzYQ6mV5hOjT4afEgrEKuTnjyo0.jar
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.27.0-SNAPSHOT-QSbFgp9zzoP2263SfShPLHuoMIncHbZ_lMX34tgXGOQ.jar
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.27.0-SNAPSHOT-FUUMTPxJbi7_ObGGTu8-kWc9u1diCV1xQSMhqRoHlB4.jar
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT-ZF6PSOW1lVaviAkMAOkkotB1w4jNgu4G9x8BqI98IOg.jar
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-7h4ih0WLpbfpuzwXlLfLCuAwGwdSlCjPbj4U06CeyjU.jar
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT-TXYiVyOudvB3Czvw-_YcExFVU43TE09LnpQmUPe6y6Y.jar
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-BsRGVrZpvbIA9mlIW5nSnLwnXbNQnBI97FjYz0VZjy4.jar
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded-OXqFAluZriIZ1nnkmNB7nVWNys3QG0CrJEw7fm81s4k.jar
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.27.0-SNAPSHOT-BabkJycEaUEFTLqo4SqBgAwV2KoiK3PEyOHkcPk_DRA.jar
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-tests-WUVcQ1OO4pTlHRSyAAdsB4ja_Fhgv9C8REeCE6UCyw8.jar
    Nov 09, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests-vKt4r8-2hIdzNm5hnePn3QBbRvj-8pV5BFH4RbFdZkA.jar
    Nov 09, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.27.0-SNAPSHOT-iwECI4UxxpzhyGgp-Ljt62FrduoQUbLN9mG4-eqkHOU.jar
    Nov 09, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.27.0-SNAPSHOT-5d614oTRt9MJHxfhd0cirIC-Oyqtz9CXea7bFk3ZF2U.jar
    Nov 09, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.27.0-SNAPSHOT-kLUk4T58HNXb3aI6bGTxkGU9UzBA5jC2v2AQ7fSjPXw.jar
    Nov 09, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 196 files cached, 30 files newly uploaded in 1 seconds
    Nov 09, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 09, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 09, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 09, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 09, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 09, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98996 bytes, hash 43874502a3df9e796d4de7cfae53eb3de84bb25023e2eb280fde7110ea947105> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Q4dFAqPfnnltTefPrlPrPehLslAj4usoD95xEOqUcQU.pb
    Nov 09, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.27.0-SNAPSHOT
    Nov 09, 2020 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-08_16_45_22-12345264395346931206?project=apache-beam-testing
    Nov 09, 2020 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-08_16_45_22-12345264395346931206
    Nov 09, 2020 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-08_16_45_22-12345264395346931206
    Nov 09, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-09T00:45:22.233Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 09, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T00:45:31.435Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 09, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T00:45:32.727Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 09, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T00:45:32.814Z: Expanding GroupByKey operations into optimizable parts.
    Nov 09, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T00:45:32.875Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 09, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T00:45:32.990Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 09, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T00:45:33.043Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 09, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T00:45:33.087Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 09, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T00:45:33.110Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 09, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T00:45:33.506Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 09, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T00:45:33.589Z: Starting 5 workers in us-central1-f...
    Nov 09, 2020 12:46:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T00:46:00.313Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 09, 2020 12:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T00:46:10.319Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 09, 2020 12:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T00:46:21.241Z: Workers have started successfully.
    Nov 09, 2020 12:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T00:46:21.310Z: Workers have started successfully.
    Nov 09, 2020 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T00:46:59.060Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 09, 2020 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T00:46:59.255Z: Cleaning up.
    Nov 09, 2020 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T00:46:59.381Z: Stopping worker pool...
    Nov 09, 2020 12:47:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T00:47:55.738Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 09, 2020 12:47:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-09T00:47:55.794Z: Worker pool stopped.
    Nov 09, 2020 12:48:01 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-08_16_45_22-12345264395346931206 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 4861fd25-2dfe-47a4-9ed6-fc2cf6989926 and timestamp: 2020-11-09T00:48:01.507000000Z:
                     Metric:                    Value:
                   read_time                     17.51
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 09, 2020 12:48:01 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 53.484 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 46s
137 actionable tasks: 96 executed, 41 from cache

Publishing build scan...
https://gradle.com/s/nwpl5whotc7yq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1215

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1215/display/redirect>

Changes:


------------------------------------------
[...truncated 339.31 KB...]
    INFO: BigQuery method is set to: DEFAULT
    Nov 08, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 08, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 08, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 08, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 08, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 08, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1452961901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:280)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:153)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:546)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
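
    Note: the failure message above also names the usual remedies. As a minimal illustrative sketch only (not the integration test's own code; the schema, values, and class name below are made up for the example), either setRowSchema or an explicit RowCoder satisfies the coder check for a PCollection of Beam Rows:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TypeDescriptors;

    public class RowCoderFixSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Schema matching the projected columns of the query above (illustrative only).
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt32Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(
                Create.of(Row.withSchema(schema).addValues("alice", "story", "hello", 3).build())
                    .withCoder(RowCoder.of(schema)));

        // A Row-producing transform whose output coder cannot be inferred;
        // setRowSchema (or setCoder with a RowCoder) resolves the check that failed above.
        rows.apply(MapElements.into(TypeDescriptors.rows()).via((Row r) -> r))
            .setRowSchema(schema);
        // ...or equivalently: .setCoder(RowCoder.of(schema));

        p.run().waitUntilFinish();
      }
    }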

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 08, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 08, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 08, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 08, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 08, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 08, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 08, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 08, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
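
    Note: the projection and filter logged above roughly correspond to what BigQueryIO's DIRECT_READ (Storage Read API) path exposes as selected fields and a row restriction. A hedged sketch only; the table reference, class name, and wiring below are assumptions for illustration, not the test's code:

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class PushDownReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Read only the used fields and apply the filter server-side via the
        // BigQuery Storage Read API, which is what the logged push-down amounts to.
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("apache-beam-testing:beam.HACKER_NEWS")  // illustrative table reference
                    .withMethod(Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
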
    Nov 08, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-NXC3ur0LsDN23c7p0rgHSkWJ0HrFKKKhMaHLx7M4CZ0.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-HtbwnojdzB2V98xZufxe1f-8SvXGZUUO9NMhwzBrhsg.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-FRmoxp4dytGc-IhMXxab4ACFPMfaRUbHp7bkWDa195c.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests-FQCCvHw5oYZQl18ucXFFs_bFWHC0hD97qk-YWw-NIXI.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-lxwYeIioE7JRscuISpSHLOv0pQnoC6waanoY06LBfCA.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests-uTzd9LYv0dwysJ77sdRyhO1P2YodLywBoUVWmhhoqfo.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-hcQCPxr6UN_wj84kK90xnNfKUG73Vz2pi_zfIMYtu2A.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests-SzmBhsB62jNC3cBTu_8Au0fs-bRUG6hC5s2DcwZLZ00.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests-_0r32GPF79O_tgd-CyyFEbV23_BPoJNwLdu9haQsu3k.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8818665456441535892.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-MrrGrAmmZ5yF0xCZzLmfPuy1lAGCrkOtH_gOl6fccDM.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-TUMGKSJXtW2VpmeFxG7z_n5fQumSE44mDBSJVTUyD6o.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-dHEuTGSLXGQcvnwvqKUSfu417A9tD-UGmKQmnwXSKHE.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests-dhAP8oJbJNmV6UXfdk0vymSaOEu4NMoD1z-qY6hEA14.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-ENbQ5a5ew5IYzRF9LHc6r7wjikc4rXX-yuo24iWo1Ik.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests-d1cBW3OgHc5Kft6b0ivHSA7PHDnrRTqU9zayz6XnRuM.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT-xRzmXNQSYTsV9xHG1RQ-ES3kP1Ib7M-ncb5DiUc9Yss.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-RZEywbjD1oreRIcHFXjy5ajiHBBWqcXEyTftTnWBaJw.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.27.0-SNAPSHOT-Uoaie9tNfvS4rnOQEfw9R3coo-gdB5w_DdIPs-ouNew.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests-aq6CK_ojE_3B7Cl02GTJpRLgRMZk2BHQ1Lr7RFCUmRE.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-zELA5oWDy6xMfRx1rQmI-vh9hp6SslN-OnPKe9U9N78.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-mNd0DuNDeHqst9Mdn0d01yj2yFbTiCy1K4wrN-FYcGE.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT-6VVHr4V4wAKjJ4P2X96VE0t03oNwmbjRWLXjf_AU3H8.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.27.0-SNAPSHOT-tRJ0Uj9h9fFLK63k1uWx3iT2gV9k-7idmaW44CUOU_c.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded-qpk_9X5zlnPWkz8DvQnkI3AMJrszgpyQA0TVWmH6mhQ.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT-0Sv2CLDPBxUd5MlNJ4fwgdJlHBNIIhj05G6HAQu4OeA.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-tests-wNlDoGRyjxb5IEFbT3fcktV1XyFOSg3QgQXt8kFYB8M.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.27.0-SNAPSHOT-uG4kHmC21agjpgDXf-GQ4R9S-lLhI4LKKldeYeyUDso.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-NXC3ur0LsDN23c7p0rgHSkWJ0HrFKKKhMaHLx7M4CZ0.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.27.0-SNAPSHOT-2B8jhRLiIEreqX3LPnatye88rqHPAfuP2IZjmpCQTos.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.27.0-SNAPSHOT-ryNeQ_T4NoMgxE_PP3Zl8htDAWECqNpmYE5xtrL0lDg.jar
    Nov 08, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.27.0-SNAPSHOT-78BOKybkQzximSlK0fAgVYmKEFmo3siYX1i_0kS_KMk.jar
    Nov 08, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 196 files cached, 30 files newly uploaded in 1 seconds
    Nov 08, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 08, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 08, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 08, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 08, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 08, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98994 bytes, hash c279c2cf9b34feb120984149e5790e77732e3ca852f53638f34003c8eefc7b83> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-wnnCz5s0_rEgmEFJ5XkOd3MuPKhS9TY480ADyO78e4M.pb
    Nov 08, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.27.0-SNAPSHOT
    Nov 08, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-08_10_45_20-3176507254190115788?project=apache-beam-testing
    Nov 08, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-08_10_45_20-3176507254190115788
    Nov 08, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-08_10_45_20-3176507254190115788
    Nov 08, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-08T18:45:20.222Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 08, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T18:45:29.449Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 08, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T18:45:30.071Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 08, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T18:45:30.123Z: Expanding GroupByKey operations into optimizable parts.
    Nov 08, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T18:45:30.159Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 08, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T18:45:30.234Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 08, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T18:45:30.281Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 08, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T18:45:30.316Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 08, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T18:45:30.351Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 08, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T18:45:30.791Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 08, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T18:45:30.866Z: Starting 5 workers in us-central1-f...
    Nov 08, 2020 6:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T18:45:45.673Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 08, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T18:45:55.910Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Nov 08, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T18:45:55.964Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Nov 08, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T18:46:06.238Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Nov 08, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T18:46:06.265Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Nov 08, 2020 6:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T18:46:16.546Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 08, 2020 6:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T18:46:17.824Z: Workers have started successfully.
    Nov 08, 2020 6:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T18:46:17.850Z: Workers have started successfully.
    Nov 08, 2020 6:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T18:46:54.767Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 08, 2020 6:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T18:46:54.926Z: Cleaning up.
    Nov 08, 2020 6:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T18:46:55.010Z: Stopping worker pool...
    Nov 08, 2020 6:48:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T18:48:05.991Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 08, 2020 6:48:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T18:48:06.036Z: Worker pool stopped.
    Nov 08, 2020 6:48:11 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-08_10_45_20-3176507254190115788 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0878048b-fc64-4757-9566-77cfb1c62a58 and timestamp: 2020-11-08T18:48:11.817000000Z:
                     Metric:                    Value:
                   read_time                    16.148
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 08, 2020 6:48:12 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 3 mins 5.664 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 56s
137 actionable tasks: 96 executed, 41 from cache

Publishing build scan...
https://gradle.com/s/ydty6fma47nba

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1214

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1214/display/redirect>

Changes:


------------------------------------------
[...truncated 339.17 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 08, 2020 12:46:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 08, 2020 12:46:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 08, 2020 12:46:45 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 08, 2020 12:46:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 08, 2020 12:46:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 08, 2020 12:46:45 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 08, 2020 12:46:45 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1452961901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:280)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:153)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:546)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 08, 2020 12:46:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 08, 2020 12:46:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 08, 2020 12:46:45 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 08, 2020 12:46:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 08, 2020 12:46:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 08, 2020 12:46:45 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 08, 2020 12:46:45 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 08, 2020 12:46:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Nov 08, 2020 12:46:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-7Bt5IQhVCa3zqN8Hk0YR8CA-vj4uHc9PTOkgF47OAlA.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests-rlPXscu-ZcT8LiQ_cvEevkFqOHTWf5bz33ny2fAue-s.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests-eeJmB-rzDK-GQApfjfBujYcsZ_um-JKbBbppFIVKzno.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test9024026951585539361.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-pEehudx8Ek77WDRTbS0DfsndoeYe2F-Und-VCBWI5dk.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-DA3vBAYfh0IxRUDDbIzGWpKfOdzN6d60xB9uEKbLn-o.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests-kZV-5NOc743d6QW_ZDZOdnNGoaJQB4ZXol8anzetDdA.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-mlfnouy9Nu0XIYlC6YoSVvRj5QMjjPBRuFy3pjDidqE.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT-BKm17QKoY7WUXnj0vD_0Awua_0IbTql4UR6sWIRdxNg.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests-w1b-dan36iaIIZUxfn5F1c-zWlBsXTUjGvidEn6Pzxc.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-E3qR4zFvXs2a-yEg0sFzjt0glrwlS_FicQ1-tr7YW9s.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-shPX2cbm8arOl5r_q4_VaK7Lbgo8lN6aUF5SAZqRDi4.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests-vuJhiU4pFmZz8KNQD_L7kVuh1R8zdfEYq8l2hxuABBE.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-GTUEwRmlvIrv2Fe-JtG54-g0KUlKA20fT6Ot_9ZDdqY.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-oc8yeQOwsMen8OpkIz78g6WwFZwD4EntqFLBWBxB2gQ.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-GQ1ltLqnv5fLVyanYc4Cx2IUQS06Xyw-SCYKu3n3kqw.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.27.0-SNAPSHOT-b2wgr4c9Y1ffOp2fF0Pqhtm5uPWi7ot6vE7Hhs9itiM.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-1DpkZ96idf9tzn7tPCyLIVnRcoE92B2-N4abVUr57Pc.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT-g4LnqCmEeBiup8wfuSd6lU_vF2gBu_WHqiYOQu-wMBI.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-trIrjCMQQuKNS9Svt-fYNVs-J1ynP251qLRYPKuHVk0.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests-uGVqmRO1UZrQVkZIgK8taLotP0_Ll_bKQHapLGr1IlY.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.27.0-SNAPSHOT-sr2gjWSbarsdiDli23fcSJ6SczFdcXDpsx1pxJ9GJRA.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT-YznfAMApycqnNk0efXXw1NtJ9U80xns1LPFCyyf1V5g.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests-jiYM7pmogBpN2YHG9tp_lUo9NjacvJqXVz7fItOayDo.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-2eVqpNbB2Jyg5tMYVAjz4kBaQdz97E44QsZ3R8MBBSE.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-7Bt5IQhVCa3zqN8Hk0YR8CA-vj4uHc9PTOkgF47OAlA.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-tests-h0nL0Tma8lChPWULzQT0zKpPz54eeKKNBvPo56XSt8o.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.27.0-SNAPSHOT-dVvxo_-VUWQbFXHfuy41pjJCR6HLXIJkm-F8g4qdczo.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded-CwDiVTKaBt1dP4g49Nc8_r3KOd2upEU2NMiFRjsQIrk.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.27.0-SNAPSHOT-uP1I7AcBeeNpwKS0jXJx0BZIa9ReZytc4uyouzxp9YA.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.27.0-SNAPSHOT-MvoXRKfSg0A_6EJbEw2L1SwuERRykQDyh2YwCg8-yk8.jar
    Nov 08, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.27.0-SNAPSHOT-Cwuu8cE2xllchdZJlxlNMeBV5A5JsBZ0RD7nWAcy1vc.jar
    Nov 08, 2020 12:46:50 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 196 files cached, 30 files newly uploaded in 0 seconds
    Nov 08, 2020 12:46:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 08, 2020 12:46:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 08, 2020 12:46:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 08, 2020 12:46:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 08, 2020 12:46:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 08, 2020 12:46:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98996 bytes, hash 0da519baf15eafb6b3d313a205de92f547deae930ed56185dd9f5be34f9073c3> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-DaUZuvFer7az0xOiBd6S9UferpMO1WGF3Z9b40-Qc8M.pb
    Nov 08, 2020 12:46:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.27.0-SNAPSHOT
    Nov 08, 2020 12:46:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-08_04_46_50-12423472385825672518?project=apache-beam-testing
    Nov 08, 2020 12:46:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-08_04_46_50-12423472385825672518
    Nov 08, 2020 12:46:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-08_04_46_50-12423472385825672518
    Nov 08, 2020 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-08T12:46:50.557Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 08, 2020 12:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T12:46:59.671Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 08, 2020 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T12:47:00.263Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 08, 2020 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T12:47:00.296Z: Expanding GroupByKey operations into optimizable parts.
    Nov 08, 2020 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T12:47:00.318Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 08, 2020 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T12:47:00.389Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 08, 2020 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T12:47:00.419Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 08, 2020 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T12:47:00.446Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 08, 2020 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T12:47:00.477Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 08, 2020 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T12:47:00.807Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 08, 2020 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T12:47:00.866Z: Starting 5 workers in us-central1-f...
    Nov 08, 2020 12:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T12:47:13.338Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 08, 2020 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T12:47:32.404Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Nov 08, 2020 12:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T12:47:32.437Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Nov 08, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T12:47:42.714Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 08, 2020 12:47:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T12:47:54.283Z: Workers have started successfully.
    Nov 08, 2020 12:47:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T12:47:54.323Z: Workers have started successfully.
    Nov 08, 2020 12:48:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T12:48:32.999Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 08, 2020 12:48:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T12:48:33.165Z: Cleaning up.
    Nov 08, 2020 12:48:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T12:48:33.227Z: Stopping worker pool...
    Nov 08, 2020 12:49:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T12:49:17.876Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 08, 2020 12:49:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T12:49:17.922Z: Worker pool stopped.
    Nov 08, 2020 12:49:23 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-08_04_46_50-12423472385825672518 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): dba953c8-e64d-4769-8715-e464d8b26c50 and timestamp: 2020-11-08T12:49:23.148000000Z:
                     Metric:                    Value:
                   read_time                     16.76
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 08, 2020 12:49:23 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 46.965 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 5s
137 actionable tasks: 99 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/6ngdn6hoffwjq

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1213

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1213/display/redirect>

Changes:


------------------------------------------
[...truncated 337.83 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 08, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 08, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 08, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 08, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 08, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 08, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 08, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1452961901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:280)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:153)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:546)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
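
The readUsingDefaultMethod failure above is the coder-inference problem the exception spells out: with the DEFAULT read method the Row-typed output of ParDo(RowMonitor) carries no schema, so Beam cannot construct a RowCoder for it. Below is a minimal, self-contained sketch in the spirit of the fix the message itself suggests, attaching a schema to a Row-producing PCollection with setRowSchema; the pipeline, schema, and field names here are hypothetical and are not the IT's actual code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Hypothetical schema; the integration test takes its schema from the BigQuery table.
        Schema schema =
            Schema.builder().addStringField("type").addInt64Field("score").build();

        PCollection<Row> rows =
            p.apply(Create.of("story", "job", "comment"))
                .apply(
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void process(@Element String s, OutputReceiver<Row> out) {
                            // Emit one Row per input string.
                            out.output(
                                Row.withSchema(schema).addValues(s, (long) s.length()).build());
                          }
                        }))
                // Without this call, coder inference for Row fails with the same
                // IllegalStateException reported above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }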

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 08, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 08, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 08, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 08, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 08, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 08, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 08, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 08, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
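
    With DIRECT_READ, the BEAMPlan above pushes both the projection (usedFields=[by, type, title, score]) and the supported filter into the BigQuery Storage read, so only four columns leave the source; under the DEFAULT method earlier in this log, all fourteen input fields are read and the filter is evaluated in BeamCalcRel instead. Purely as a point of reference, the sketch below runs the same SELECT/WHERE through Beam SQL's SqlTransform against a local, schema-aware PCollection; the data and field names are hypothetical and no BigQuery read or push-down is involved.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class PushDownQuerySketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Hypothetical rows with the four columns the test projects.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> input =
            p.apply(
                Create.of(
                        Row.withSchema(schema).addValues("a", "story", "kept", 3L).build(),
                        Row.withSchema(schema).addValues("b", "comment", "dropped", 9L).build())
                    .withRowSchema(schema));

        // PCOLLECTION is the implicit table name for the transform's input.
        PCollection<Row> filtered =
            input.apply(
                SqlTransform.query(
                    "SELECT author, `type`, title, score FROM PCOLLECTION "
                        + "WHERE (`type` = 'story' OR `type` = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
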
    Nov 08, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-HOhqH_IsRtsU9_yQQGRWd5g_jNao8_DqIgEHcBXSmkg.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests-qlniDgeginNo_ZA5lw_LQ6-orKXs86lrZGQ7fKhg7EA.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8467387439516978617.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-2NuHAXr4JFUxH0oR2JFGQsQsNXqb1pilPiV603buuoM.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-xWn3wZtZLdDffyLGaJYgKKkrZzA4IJjJU4UwHmzJvKk.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT-l-EvFuHVmPHSdj6dIPL7H38F9sYumSJRtxX0Q7_FzI8.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-Fk14jIB1DdJPv0F4jnJCQ_iy3xVwSwW-mohzbaFWqmE.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests-uTJe8FnSRgv_0ff5YUw_WlpxcTEVOuP-MUI3XQGJi1I.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-v2XsBhDFkUOBDecDxxxXkd_u8yRHqxFeu3Ic-Mn1Ip0.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests-GSjDwfpufOdAUec1YnVTcCochTU7vHy9yFxLgTNARVU.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests-S-H_14-sgGd_B6MEemlZPiTq2TS5QXEkZ-gTDTk50FE.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-7sR3krcuQxLU9rOJTOs7eZ9xPU8rDZSVdPzMnqfIBAM.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests-_SHtGjMYqVboCdLw-3GFFcvG1li8eUxPWcZFFRTCrOk.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-bp6x1N1gs2XJ5ZU2v3hiEBK70uxYSxDIDTiE2-X8lds.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests-Kpz6XgWQ-GwSiECuZBKvJhQREiAOge6exsdIsLR2B7E.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-W5oM679LN3f-tr29pvBNhGI5Tljdi-ML0UerC-lq2kg.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-eOwQcIIb8n5eBWmDPfuy5SImhHz2Dj2X_FFMPfjotKU.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.27.0-SNAPSHOT-eI27YjkY_4kZ6rloY0GiJwd3lKVVrT1Z8CW67WGaY1w.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-etKDt-c5JIpa1tfXL90y86kV9kCK8udvE_Sad4LtuJo.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-HOhqH_IsRtsU9_yQQGRWd5g_jNao8_DqIgEHcBXSmkg.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.27.0-SNAPSHOT-v-1EdjmbK5QUucCCvKYp4VkzohPZEry1CBt3JtnrJ3s.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests-FEs5uTot__aZ-DtmJy7k1BC63C8M0hJR8mbQtnFhFps.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT-mOtQ9011HAyplywXFo8VijLSjhnt2G0YSGd-LAucZNk.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT-87R-siZWNf38nT6f55lojcO_GLwOM_5T1y7fQb3d2CQ.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-H3XSqxX8p5-ywfNhKpDpMgBhCfNSe0WEFt-qKuUI7ro.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-tests-pvjT7tIAe9zk9OBxbMY5aQvpkigYcrvxvtsaQpW4R_E.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded-0rR6ChMVwDJrxjhovlUCnSL4k-oj-Dwn7Z1iA9heJvQ.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-UvSZmUkp9miLdTymNS7hBLMEKlpXxfcvsFQHIlenaU0.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.27.0-SNAPSHOT-HEAaHBB6o9nqkJSBtV0vRxw2AdAqv6nDH8fk3ThJ5FE.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.27.0-SNAPSHOT-W4uf4dJM4ZaZGIQUy94L4e5vn2iKfD1B0JxAxVn2W9o.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.27.0-SNAPSHOT-hykrisnKAFPlSLMVKb5JNBKF3oqepFLIkiXmhZJjtUQ.jar
    Nov 08, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.27.0-SNAPSHOT-RlOyBC1txgtBHDYOoSBF728UWEWrSY-NuODfjuolaYM.jar
    Nov 08, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 196 files cached, 30 files newly uploaded in 1 seconds
    Nov 08, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 08, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 08, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 08, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 08, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 08, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98996 bytes, hash 10206bc6fa67670e8248b2cf87064e8f514504610d03f3255248fc4ee64131b6> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ECBrxvpnZw6CSLLPhwZOj1FFBGENA_MlUkj8TuZBMbY.pb
    Nov 08, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.27.0-SNAPSHOT
    Nov 08, 2020 6:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-07_22_45_21-18214865967071726860?project=apache-beam-testing
    Nov 08, 2020 6:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-07_22_45_21-18214865967071726860
    Nov 08, 2020 6:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-07_22_45_21-18214865967071726860
    Nov 08, 2020 6:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-08T06:45:21.864Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 08, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T06:45:29.407Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 08, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T06:45:30.072Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 08, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T06:45:30.112Z: Expanding GroupByKey operations into optimizable parts.
    Nov 08, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T06:45:30.138Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 08, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T06:45:30.211Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 08, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T06:45:30.252Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 08, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T06:45:30.285Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 08, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T06:45:30.320Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 08, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T06:45:30.763Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 08, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T06:45:30.835Z: Starting 5 workers in us-central1-f...
    Nov 08, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T06:45:47.525Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 08, 2020 6:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T06:45:58.606Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Nov 08, 2020 6:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T06:45:58.644Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Nov 08, 2020 6:46:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T06:46:18.853Z: Workers have started successfully.
    Nov 08, 2020 6:46:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T06:46:18.893Z: Workers have started successfully.
    Nov 08, 2020 6:46:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T06:46:39.925Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 08, 2020 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T06:46:54.656Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 08, 2020 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T06:46:54.812Z: Cleaning up.
    Nov 08, 2020 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T06:46:54.886Z: Stopping worker pool...
    Nov 08, 2020 6:47:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T06:47:44.402Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 08, 2020 6:47:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T06:47:44.452Z: Worker pool stopped.
    Nov 08, 2020 6:47:50 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-07_22_45_21-18214865967071726860 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 2a560fb2-ccd3-4fbb-8382-80ebf37f4dce and timestamp: 2020-11-08T06:47:50.323000000Z:
                     Metric:                    Value:
                   read_time                    14.144
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 08, 2020 6:47:50 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 42.019 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 33s
137 actionable tasks: 96 executed, 41 from cache

Publishing build scan...
https://gradle.com/s/biadseszqqqno

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1212

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1212/display/redirect>

Changes:


------------------------------------------
[...truncated 336.81 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 08, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 08, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 08, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 08, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 08, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 08, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 08, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1452961901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:280)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:153)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:546)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 08, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 08, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 08, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 08, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 08, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 08, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 08, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 08, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Nov 08, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-tFpyKNloC272c4vLuERoVrJQWzPBiLaUpTaPLRDl0ZM.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests-hQrnzmXhS645kv8rTDYoeyKLNX5wK8Rlt0QpOg5IA9A.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT-UTO_wHq1NZMFEX4972AUBwMvESdmd7sA7mmRN32cTnU.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-CDqYGcMMlQDwMOCpGLBhUAs5eew2hm56qWwlZQykzaw.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests-3B0FTJS569kAGwnRFaePUczD2Js-JmrlDsyb5u-yBJ4.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-OQdbERlOd492kJJx9jMjByy3zYNq0KhnGTXoYgIdPrY.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests-_hsiAPWiqWqPKJvIKXNFmA_hVWVKwsGgI4_DhDna5Cs.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-hv7uUmoOlY55ds8tzPnJ4mE0yKTIU-LAOHZISSnSR8E.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2890983678278828438.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-_5vAPwtjC2DMyjLlcfPFf_fp11FXWZuB0B8sVP0QGnk.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests-e296t2roBFdMTDj5K7ZOxdhM_V4jlbxI_-YCas5VMOc.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-z8okFSfpJGcvtWS1TB9UrT_6IVYhPXfGRZS2Uv23hyY.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests-uZUyg28U2MiD0gSySeqMfEBbemX7mm9sc1-UlwcvaLQ.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-Y4h9KztaXQcwm41wPzlFRXJXxZCu3qfJXzyBs2LUGYM.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-InURNHvOin09bJzkzskpb7IGQFHB1iYuiZKNx56TvcY.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests-7vKt7_iyEbaeM0m2mEOTXr7m0Mbynu7enbROmt5uU_U.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-28frUB0xQLGlco_UHSwP4IkW4yCwhRKRYl6IRiq1960.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests-ceXym6uOMN2Tcq-tmev9irHDLIt-AIjIC92jkFPHadY.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.27.0-SNAPSHOT-Bf6zkAPQdRdMXLL6XgafdF2GgVqK1qiRhB2Llifscas.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.27.0-SNAPSHOT-nFiuzkSZQ2_rAJtJZuTwe5cTGNdK8ObjWiFf287ckJE.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-tFpyKNloC272c4vLuERoVrJQWzPBiLaUpTaPLRDl0ZM.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.27.0-SNAPSHOT-p_1FhTrp5v5DzkCuLojzRnV4pjJ8Rhvy5hBlQy_SjeE.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT-rJ6IuVLpekjQm4r1-OWWXqHfdT2d-cpu0dfqNN1IwNw.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-2Bf5mYqE8nwRLHvSDqWhyWqpU-XK9xMCPOX0JmIg5Tg.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-RWvABZvV_ptF3mcJDQAXwyEPiegSvsWeJS381bOsysc.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-uTF6RYG3m1IakwyG_n7bOoi5ut4_fgPuH3Bw1o5YCcs.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-tests-AHx2xP6mNB-8YtJSogLftHfpW8UKOpt8HST_Fb3kc00.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT-mWG7pwryCPYfK42To2Oe2G5EpBInFZ18vUP_UNwA0Aw.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded-4OpmWZpZbmGdLE1akb4pEzYJ6k9Z8D3x-kyVY-uKgzQ.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.27.0-SNAPSHOT-sT-oniXXjZxRJGh9KhIvtzoZqN2pSv6d4WRrFRJHYmU.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.27.0-SNAPSHOT-USOAtbnfVK33m4TzOFZLs-dhTNNGBAwwiut89EhK5nU.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.27.0-SNAPSHOT-EKl2z8cZuvDPiY7JKg2ftdogiApd5phTVdoBHb2Paw4.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth.visible-assertions/visible-assertions/2.1.2/20d31a578030ec8e941888537267d3123c2ad1c1/visible-assertions-2.1.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/visible-assertions-2.1.2-RQSulosjfNzcto_1sHqmOr5JkvkHp3w9YSCqm5BBQBw.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth.duct-tape/duct-tape/1.0.8/92edc22a9ab2f3e17c9bf700aaee377d50e8b530/duct-tape-1.0.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/duct-tape-1.0.8-Mc7xLd7JedH4bXz3CMQaF9pSPQXGhf1mQunQsq3bckA.jar
    Nov 08, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna/5.5.0/e0845217c4907822403912ad6828d8e0b256208/jna-5.5.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-5.5.0-swj66_5O1AnehBDgpjLRZLISawNfbqz_lo05CMr7TZ4.jar
    Nov 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 33 files newly uploaded in 11 seconds
    Nov 08, 2020 12:45:30 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 08, 2020 12:45:31 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 08, 2020 12:45:31 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 08, 2020 12:45:31 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 08, 2020 12:45:31 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 08, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98996 bytes, hash 17a36e479e8d2dc6669682e05f1be2cec2a4f05c8c36e09f171a36621735ab6c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-F6NuR56NLcZmloLgXxvizsKk8FyMNuCfFxo2Yhc1q2w.pb
    Nov 08, 2020 12:45:31 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.27.0-SNAPSHOT
    Nov 08, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-07_16_45_31-13157112929436058762?project=apache-beam-testing
    Nov 08, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-07_16_45_31-13157112929436058762
    Nov 08, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-07_16_45_31-13157112929436058762
    Nov 08, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-08T00:45:31.436Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T00:45:40.496Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T00:45:41.432Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T00:45:41.494Z: Expanding GroupByKey operations into optimizable parts.
    Nov 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T00:45:41.536Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T00:45:41.618Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T00:45:41.650Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T00:45:41.690Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T00:45:41.731Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 08, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T00:45:42.180Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 08, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T00:45:42.311Z: Starting 5 workers in us-central1-f...
    Nov 08, 2020 12:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T00:46:10.487Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 08, 2020 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T00:46:19.923Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 08, 2020 12:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T00:46:36.852Z: Workers have started successfully.
    Nov 08, 2020 12:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T00:46:36.902Z: Workers have started successfully.
    Nov 08, 2020 12:47:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T00:47:12.799Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 08, 2020 12:47:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T00:47:13.086Z: Cleaning up.
    Nov 08, 2020 12:47:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T00:47:13.227Z: Stopping worker pool...
    Nov 08, 2020 12:48:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T00:48:00.607Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 08, 2020 12:48:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-08T00:48:00.672Z: Worker pool stopped.
    Nov 08, 2020 12:48:06 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-07_16_45_31-13157112929436058762 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 5c8bd753-cc3a-4270-958c-6eca7560d9ac and timestamp: 2020-11-08T00:48:06.689000000Z:
                     Metric:                    Value:
                   read_time                     16.38
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 08, 2020 12:48:07 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.037 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 59.654 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 50s
137 actionable tasks: 96 executed, 41 from cache

Publishing build scan...
https://gradle.com/s/jzvoztnep6h4e

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1211

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1211/display/redirect>

Changes:


------------------------------------------
[...truncated 337.32 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 07, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 07, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 07, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 07, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 07, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 07, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 07, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1452961901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:280)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:153)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:546)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
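
The root-cause list above points at the usual fix: attach a schema (or an explicit coder) to the Row PCollection before the pipeline tries to infer one. A minimal sketch of that pattern, with an assumed schema matching the four projected columns and a hypothetical pass-through DoFn standing in for the test's RowMonitor:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class AttachRowSchema {
      // Hypothetical helper: the field names mirror the query's projection;
      // the types are assumptions for illustration only.
      static PCollection<Row> monitorRows(PCollection<Row> input) {
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt32Field("score")
                .build();

        return input
            .apply("RowMonitor", ParDo.of(new DoFn<Row, Row>() {
              @ProcessElement
              public void processElement(@Element Row row, OutputReceiver<Row> out) {
                out.output(row); // pass-through; a real monitor would record metrics
              }
            }))
            // Without a schema (or coder), construction fails with the
            // IllegalStateException shown above.
            .setRowSchema(schema);
      }
    }

Equivalently, .setCoder(RowCoder.of(schema)) would supply the coder directly instead of setting the row schema.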

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 07, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 07, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 07, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 07, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 07, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 07, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 07, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 07, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Nov 07, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-iFDHAapcCmshXxgXeYJlK9mG1T7SMN8dQLAqj5meIFY.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-D-u98BX6b99cZKH3dL9xO9Bxr5OfxCafNTrwo5AfOlw.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6768802342439508961.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-xSVMg71_lWDWrV618Gj6pmRwASq_N66NJ2uSWPeEU-c.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests-LLHnYt4lBCAZKUmeS03wzUScTt3hyCdPUufUIlzMUNw.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-0SJTzVBcBuYe5nsiMrFDrEIVLy_p5atydK5cf-LTkLA.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-L5_C0GTO2yniQjvyfeQIchm3gvs-WCDmfhAueodp8Aw.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests-JcTv16KW7QtGHB0P5YHUYVG1tKK8UVietPmiT6sDRpQ.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-4RK9rlNcJaY7c04jr-fkqk1JitGhI-gzq4v9jKeD2zs.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT-_JP9QiTmHQUvryuciFfE7We1elhFevaRRnu0gCAeLmc.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-qPv38Kej4cibVMHBc4mk105jgF_TzDspPYUp9wgxgAo.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-DXzfRB7XuIMTdct-YRC91hptMx7r5GoVHYFN18jipGc.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests-MUCck-eE6etV2Dplq7CcZhwDj8jFjtnvCTCu8ZSJcmU.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests-jfcs0mZydZhNsIMAfyyLE0t-beFkuaoVy6jrK7cKXdI.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-ieKEeHe4T2d8umpm8aUn3apAZQmUOSpWEyFznXRKPj0.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests-O8ITxXmzW_oOKWB_b9r7sJLXJP3LKTI1zdg6VOK5OPI.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests-XxExUsv8rAsNXeOpExIHlPIFFh3q53P3OxOa65jwZBQ.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.27.0-SNAPSHOT-Lc9n0hHn7jvnsjf4Ngum8gwH4lmaJL-4MEt6V5wQuFI.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.27.0-SNAPSHOT-tq3oMgEjPfA5WdIldixBizKYaiFuwh6EVd_FLrdkbTg.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests-BWxzBNK3QjZA4Sio66aF9XaCwtXVF88vAACwyCGRsYI.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-xCfoNIS4ZH0ekmNDsb6IX3cTrPlWxIoZMmMoh54qA8o.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT-zMrDkeIxkMczEGUSFGMko847kyvrzz3hhXpKD8ZduQs.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT-ngNby0LJmS3am8vMjC2wgQPDLWk5HTfbZ30BpnPbXWY.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.27.0-SNAPSHOT-UCEfK7_YIHmiJPKe1qFWrzocB33RR574VhFnU_E84_g.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-1nvAiYiSz81UL18Hv4o6rpgT2UMhVeR-L9qztkkz_fg.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-tests-WzuljKMMJKDKk-Xx_MX7jXWaKgRpZJNEU-wZv3o0hko.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-l42wYCIL0eETPgT4gE5Kq85RXcDXwDQEgHVCGWLxZyA.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded-_whA1xQdUiV9L_-zM70TbI10Y1A7AFg2V3HW8sGd0EQ.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-iFDHAapcCmshXxgXeYJlK9mG1T7SMN8dQLAqj5meIFY.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.27.0-SNAPSHOT-VNvne7sB7oR3M6jqxbIzL9A-1rHOZa4-WtZNe4r-LgU.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.27.0-SNAPSHOT--pJX_N-Vv2P0h6Q_1rLz94bOvy23vHt1GshcyDal1GA.jar
    Nov 07, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.27.0-SNAPSHOT-8LnN9E70EonM_ARCrQT4FeBpjy31QRHTHqIPLwDM2lc.jar
    Nov 07, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 196 files cached, 30 files newly uploaded in 1 seconds
    Nov 07, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 07, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 07, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 07, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 07, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 07, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <99000 bytes, hash 3edf471a4a4284dd77ceee8e5ed36ca55d579e6d0d0cec45154dccde95b59f3f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Pt9HGkpChN13zu6OXtNspV1Xnm0NDOxFFU3M3pW1nz8.pb
    Nov 07, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.27.0-SNAPSHOT
    Nov 07, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-07_10_45_19-4661573846680707546?project=apache-beam-testing
    Nov 07, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-07_10_45_19-4661573846680707546
    Nov 07, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-07_10_45_19-4661573846680707546
    Nov 07, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-07T18:45:19.935Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 07, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T18:45:28.337Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 07, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T18:45:29.064Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 07, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T18:45:29.094Z: Expanding GroupByKey operations into optimizable parts.
    Nov 07, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T18:45:29.121Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 07, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T18:45:29.179Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 07, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T18:45:29.199Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 07, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T18:45:29.224Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 07, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T18:45:29.253Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 07, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T18:45:29.532Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 07, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T18:45:29.596Z: Starting 5 workers in us-central1-f...
    Nov 07, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T18:45:39.529Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 07, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T18:45:57.244Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Nov 07, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T18:45:57.281Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Nov 07, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T18:46:07.574Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 07, 2020 6:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T18:46:16.423Z: Workers have started successfully.
    Nov 07, 2020 6:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T18:46:16.460Z: Workers have started successfully.
    Nov 07, 2020 6:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T18:46:48.510Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 07, 2020 6:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T18:46:48.660Z: Cleaning up.
    Nov 07, 2020 6:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T18:46:48.745Z: Stopping worker pool...
    Nov 07, 2020 6:47:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T18:47:31.927Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 07, 2020 6:47:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T18:47:31.978Z: Worker pool stopped.
    Nov 07, 2020 6:47:37 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-07_10_45_19-4661573846680707546 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 121462f2-181d-435a-a555-76034a5eb4d9 and timestamp: 2020-11-07T18:47:37.699000000Z:
                     Metric:                    Value:
                   read_time                    11.348
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 07, 2020 6:47:38 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 31.526 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 22s
137 actionable tasks: 96 executed, 41 from cache

Publishing build scan...
https://gradle.com/s/kxwnzs2fhr6jw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1210

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1210/display/redirect>

Changes:


------------------------------------------
[...truncated 337.84 KB...]
    Nov 07, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 07, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 07, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1452961901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:280)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:153)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:546)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
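
The fix this error message points at is PCollection.setRowSchema. As a minimal, hedged sketch (not the test's own code; the class name, schema fields and values are made up to mirror the projected columns in the query above), the pattern looks like this:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TypeDescriptor;

    public class RowSchemaFixSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Illustrative schema for the projected columns in the query above;
        // the real HACKER_NEWS column types may differ.
        final Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> rows =
            pipeline
                .apply(Create.of("story"))
                .apply(
                    MapElements.into(TypeDescriptor.of(Row.class))
                        .via((String type) ->
                            Row.withSchema(schema)
                                .addValues("someone", type, "an example title", 3L)
                                .build()))
                // The step the error message asks for: declare the Row schema explicitly
                // so a coder can be inferred when the PCollection is finish-specified.
                .setRowSchema(schema);

        pipeline.run().waitUntilFinish();
      }
    }

Without the final setRowSchema call, coder inference fails at pipeline construction exactly as in the stack trace above (finishSpecifying cannot find a coder for a PCollection of Row).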

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 07, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 07, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 07, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 07, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 07, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 07, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 07, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 07, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Nov 07, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-2h8LzMGClYYz7WYwo0kgvfIGxwxZsxhFl-0owlDHwt0.jar
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests-pOhFF5VrA56Qx9zMASjwSW98A34lcpL6SLBWJprvKkg.jar
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-LYK9Umj9NCGrhh9uncOtwkD5uYE_PzeNNUkkvRNoe6E.jar
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests-J2rvXICloh-OZrtwTdj_pRmj5bXcki1hirLON9BYmR0.jar
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-AaHwuT1C24xklh8VemhynQ6s-Ph2-ReTJOD-Yig3uFI.jar
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-f0VlLtyEhK4okGvGHfClpPSz5NW_RGMQwsAsnDvHKC0.jar
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-spwn7-AVilEoARB9Gd6liiUFt2LNXzIngOraw3nKXbo.jar
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests-N316IATLtulWZEiCRmGaJ_09Q6FI-aSH7piJaaR1JnI.jar
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7986544634007327446.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-0IehUunnRg6sdmG-ggVMi3y7SBPempMizWAmwJS8oOA.jar
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/6.6.1/workerMain/gradle-worker.jar to gs://temp-storage-for-perf-tests/loadtests/staging/gradle-worker-Yl29qguE3YYIL-_l8vD9whIG86Omh5gytd72UkN5kx8.jar
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests-i8IPLQZ8qKSEFvgA-9zAdwtAU6WyzrHrHZAPKt_DsQE.jar
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests-14lXFpR7nPCJhUTADaCfWWBW0xdzRrMtyq6GHGCNJek.jar
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-hQnlt-9u98BJ-f4OM1rgYakBNRRZqH4M4W-Zi2Fd9ok.jar
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-2p6dmyhXITldK-mTZZIIZV3TJT-62w-GzByJ9mV8Xcs.jar
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests-wHsH5-B3Bnpc0t3EhlnN7iP2b6wi_MyWpahJk8QWg5U.jar
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-lyfZ4UL0h6z5iQDTugx-pRmik9140WN6kUZJuZyoTac.jar
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT-1HM4Ax68Nxm38ecpvMJWeyALB5v6sVWQpKdXXKvwdE4.jar
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests-hYuBiDUMOVfHGxcvIV1Oq5r9gQk6qAbx5GW5DzEaCIU.jar
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT-nHqer-FhYuuCTpeDFoxZAQPypZJJRH7pOT0tZ0ttYwI.jar
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.27.0-SNAPSHOT-gSuFQsdWclilJnihENQjrBOuyWQ5B-pnAqgfB4-505E.jar
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-2CsBnodpM29AOWaTYFD0VFVuj4MIBXzOizA1rIEka30.jar
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT-UJRTRuj9fxiPU5x7NxLfZ-GC7xkCNSt8S6cjoPrLIhQ.jar
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.27.0-SNAPSHOT-7GeeoU3UM08xv4maaVwNwVzRNHkK3UhmQeHCaxUZNBU.jar
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.27.0-SNAPSHOT-gdpKmBZlofZMHZkYhLMiRXvey7YZbwlg629G3SDNrUc.jar
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-tests-VFUSu1XXyOsG2uuFOcWg7HQj1U2irm2YqFm8dXeFBy4.jar
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded-rjvmEEs2_xWzQIVUyifbLYswiCu3sY5Nm4T2p0_R53A.jar
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-5QlJh8878ksG6VFNulznCkjSHFe4nHWUQrKo5brk7r8.jar
    Nov 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-aCbHXHNCAeZoDTrb410LPgjhCxaqG1kaYDSeZhFpG1k.jar
    Nov 07, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-2h8LzMGClYYz7WYwo0kgvfIGxwxZsxhFl-0owlDHwt0.jar
    Nov 07, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.27.0-SNAPSHOT-45lKEQdg7dCxfT9ytKGSrOWMcgUSehCjgFzfG3Y_rKk.jar
    Nov 07, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.27.0-SNAPSHOT-BOcGfwrEeNvjZWYJhbYp8H3RJ0vXMfuew5gYKlboo9s.jar
    Nov 07, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.27.0-SNAPSHOT-f7_hTJuMQWOGe6q95vG-dPxAEGEYNmgotcz68sJfWpM.jar
    Nov 07, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.github.docker-java/docker-java-transport-zerodep/3.2.5/7fca5e1998c889d486b01695f098ecf760a4ac90/docker-java-transport-zerodep-3.2.5.jar to gs://temp-storage-for-perf-tests/loadtests/staging/docker-java-transport-zerodep-3.2.5-OwcPjMMl-Jjxn-QgBEUudex_Jf9MIbZiSBhFLY5f9vY.jar
    Nov 07, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.github.docker-java/docker-java-api/3.2.5/8fe5c5e39f940ce58620e77cedc0a2a52d76f9d8/docker-java-api-3.2.5.jar to gs://temp-storage-for-perf-tests/loadtests/staging/docker-java-api-3.2.5-eRxdklIyFsCE2_dXBCj1039ReXyYGkafqCMwJZWXx1I.jar
    Nov 07, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna-platform/4.0.0/deb6bf66918989b50209b8c9aaf3b2561af7f011/jna-platform-4.0.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-platform-4.0.0-B21i7Yfna9yzdQ_-gKpJXuIRK9chJmOVTWVH9IJEkuk.jar
    Nov 07, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.testcontainers/testcontainers/1.15.0-rc2/f818b00ee47532c38b418e677a818ad4dca19db6/testcontainers-1.15.0-rc2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/testcontainers-1.15.0-rc2-avdkCap6A5JPV3hAULK21O7FqzEYvsRL8Ui8VniQ_Ug.jar
    Nov 07, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.github.docker-java/docker-java-transport/3.2.5/27af0ee7ebc2f5672e23ea64769497b5d55ce3ac/docker-java-transport-3.2.5.jar to gs://temp-storage-for-perf-tests/loadtests/staging/docker-java-transport-3.2.5-QA3UXPFvvCRXrSNnNwbFWAm4P7tT_L9aCJHzxaijRO8.jar
    Nov 07, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.testcontainers/kafka/1.15.0-rc2/47ff228024171dc5e0d938bc3221fae2878c1473/kafka-1.15.0-rc2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/kafka-1.15.0-rc2-bqsYOX1YvlCvptLLM_-rMZ--vSLZslIdDL-9ioqzWBU.jar
    Nov 07, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 37 files newly uploaded in 1 seconds
    Nov 07, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 07, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 07, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 07, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 07, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 07, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98998 bytes, hash 9e65e612a77405f78e4099533c5dae52353dff26d883be5a31ce64a700ba5620> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-nmXmEqd0BfeOQJlTPF2uUjU9_ybYg75aMc5kpwC6ViA.pb
    Nov 07, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.27.0-SNAPSHOT
    Nov 07, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-07_04_45_22-16804876202683058435?project=apache-beam-testing
    Nov 07, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-07_04_45_22-16804876202683058435
    Nov 07, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-07_04_45_22-16804876202683058435
    Nov 07, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-07T12:45:23.016Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 07, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T12:45:31.253Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 07, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T12:45:31.940Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 07, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T12:45:31.969Z: Expanding GroupByKey operations into optimizable parts.
    Nov 07, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T12:45:32.020Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 07, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T12:45:32.131Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 07, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T12:45:32.157Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 07, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T12:45:32.194Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 07, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T12:45:32.226Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 07, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T12:45:32.718Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 07, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T12:45:32.810Z: Starting 5 workers in us-central1-f...
    Nov 07, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T12:45:42.331Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 07, 2020 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T12:46:01.495Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Nov 07, 2020 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T12:46:01.531Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Nov 07, 2020 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T12:46:11.821Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 07, 2020 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T12:46:21.615Z: Workers have started successfully.
    Nov 07, 2020 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T12:46:21.655Z: Workers have started successfully.
    Nov 07, 2020 12:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T12:46:54.920Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 07, 2020 12:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T12:46:55.085Z: Cleaning up.
    Nov 07, 2020 12:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T12:46:55.185Z: Stopping worker pool...
    Nov 07, 2020 12:47:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T12:47:46.813Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 07, 2020 12:47:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T12:47:46.864Z: Worker pool stopped.
    Nov 07, 2020 12:47:52 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-07_04_45_22-16804876202683058435 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0126d07e-8bfa-4172-8909-0414de155519 and timestamp: 2020-11-07T12:47:52.821000000Z:
                     Metric:                    Value:
                   read_time                    13.843
                 fields_read                 4375276.0
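
For reference, the BeamPushDownIOSourceRel plan and the "Pushing down the following filter" line above correspond roughly to a direct BigQueryIO read over the Storage Read API with a column projection and a row restriction. A minimal sketch under that assumption (placeholder table reference and illustrative names; this is not the test's own code):

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Read only the columns the query uses and push the WHERE clause down to
        // the BigQuery Storage Read API, mirroring what the SQL planner logged above.
        // The table reference below is a placeholder, not the table used by the test.
        PCollection<TableRow> rows =
            pipeline.apply(
                "Read HACKER_NEWS with push-down",
                BigQueryIO.readTableRows()
                    .from("some-project:some_dataset.hacker_news")
                    .withMethod(Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        pipeline.run().waitUntilFinish();
      }
    }

Selecting only the used fields and applying the filter server-side is what keeps the fields_read count above bounded to the projected columns instead of the full table.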

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 07, 2020 12:47:53 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 44.349 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 36s
137 actionable tasks: 96 executed, 41 from cache

Publishing build scan...
https://gradle.com/s/bpjoz32umdrjg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1209

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1209/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-10188] Include links in the release checklist to verify release is

[Kyle Weaver] [BEAM-10188] Make publishing its own step for extra clarity.

[Brian Hulette] listSubscription should remove TestPubsub's own sub


------------------------------------------
[...truncated 337.59 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 07, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 07, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 07, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 07, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 07, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 07, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 07, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1452961901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:280)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:153)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:546)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 07, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 07, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 07, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 07, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 07, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 07, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 07, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 07, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Nov 07, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 07, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-pDXv5ZE2FmD3W6OupqwXZtzE3Id0Vun1T_uzo6_Vjs4.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-8d5MR7vLiWPDgQ9-6wWV13LKxvOx_nlbltLhkUT57AY.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests-T6601guo82APM6rO--Fm3JruQ4l5wz0AsaxDSLeSksE.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests-XDhfeN7SQ0LgVQcGTOigkF19oYyM1HKZEIddYg2YLXc.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-92s9Wq8OcNWtXmaCtT8sEkecTyT05_-Z2-ti3je6zUw.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests-6B5xi_2xahQQ_CLdE4aBjV2KTe12qjpnP_rVP-fdCGs.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-zAwEAsvjSvpWAiQ47Ge5y7ufWp99nuNJ8H_UmdP49mg.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT-ON_MYvJ7d21ShwhPED4y3SbufFB9JyNNnb18X08OaRk.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-UcwWZ8dCFkYmm33tLkJsqx56Q6plBhXFJVOgjCRYHw4.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests-900aLalxniNux-sA_M_6ts1ckmpiZuGXJ71-eXep0JE.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-8EhHP5BWYJozV4bTDmsZTXFGYfs-d-FrvxpMpfJ8CiU.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests-YCfBf4Y7bCgTg44L8uK90xnKZ6oNCeM3L9631cYnZe8.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4813850466189361636.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-s4JElw_3CyG8GSx9O6fQbOE7D5wfdiVQIl1PJRnFidQ.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests-nu5iFpb1pOkc9dw3sEWvYEYa459qFEbrRnIGH3vOSHw.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-YXkMs5_Pr2hFNx942ORpWyE46F5AcLSi3mLSxy85LIY.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-I2GzC4Y9ewqjQ8SiOT9nCFLITVLLL6dUuN5PhX8hSQg.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests-9Qn3TckZ4fdLyioWE9edx6qzb40lTwtv2UEzWqM4d-o.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.27.0-SNAPSHOT-oEoeFlds44aK14rO-O7ouiWFq98AJWN9uX09KsrL_Dw.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-_fZoLYD7_s-sxFUKe9AlAECV4jeDNDSryanf_G-iMzs.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT-JRdMh8Zq-lfqD8l7gP85q8o_xMO3-wnutsAAUxdUlcY.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT-_69SC0JHrnZwTid8mnjcVR1Zh5Be4CcUep0fGFcOQ7I.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.27.0-SNAPSHOT-T6SNr9ngL5_fpyULNw8eN2gWOZhVkXVwncyARbTtP_U.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-pDXv5ZE2FmD3W6OupqwXZtzE3Id0Vun1T_uzo6_Vjs4.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-VMNDrJGDeBEAnFSKCWdvInT1w3_kS1P9xUGAoMEep-E.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-KSxvSox0o00OQJ-89ir3P-QykfjAhWcqdpmhSW0CG1U.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-tests-NRPByd2NpMQCMa8vQbESm5ubAxoDGXpP6BaKAWaag8c.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded-7eJKgS-Y4xnR2YX-HqtOcj6h8zqs-5sTAN4KkukITcs.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.27.0-SNAPSHOT-WXbiK8zotwwg86Wz9qgZ23lad_oKQ54gLWCcNe4YX3U.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.27.0-SNAPSHOT-XEtHddFwtcBQHBTOHyClPrUa5mKhw_sVvhzhwAdqS4Y.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.27.0-SNAPSHOT-3M2xGnmXKII-nQtpdpVn34K6tyaQb8i2fcpC2b0golM.jar
    Nov 07, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.27.0-SNAPSHOT-TFzJqPJ0qm1L9h0CEeeXsFu3hY5jwmBogQ7AKUneLmg.jar
    Nov 07, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 196 files cached, 30 files newly uploaded in 1 seconds
    Nov 07, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 07, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 07, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 07, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 07, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 07, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <99003 bytes, hash 7805b6f078e29a31217c29d58b21443c245a4b533a53d1ef68ba726c9489e4f0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-eAW28HjimjEhfCnViyFEPCRaS1M6U9HvaLpybJSJ5PA.pb
    Nov 07, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.27.0-SNAPSHOT
    Nov 07, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-06_22_45_21-1998521765301790919?project=apache-beam-testing
    Nov 07, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-06_22_45_21-1998521765301790919
    Nov 07, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-06_22_45_21-1998521765301790919
    Nov 07, 2020 6:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-07T06:45:21.678Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 07, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T06:45:30.050Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 07, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T06:45:30.849Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 07, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T06:45:30.911Z: Expanding GroupByKey operations into optimizable parts.
    Nov 07, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T06:45:30.941Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 07, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T06:45:31.021Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 07, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T06:45:31.057Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 07, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T06:45:31.094Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 07, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T06:45:31.118Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 07, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T06:45:31.519Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 07, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T06:45:31.593Z: Starting 5 workers in us-central1-f...
    Nov 07, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T06:45:40.057Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 07, 2020 6:46:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T06:46:23.844Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Nov 07, 2020 6:46:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T06:46:23.923Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Nov 07, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T06:46:44.670Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 07, 2020 6:46:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T06:46:46.177Z: Workers have started successfully.
    Nov 07, 2020 6:46:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T06:46:46.210Z: Workers have started successfully.
    Nov 07, 2020 6:47:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T06:47:31.307Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 07, 2020 6:47:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T06:47:31.462Z: Cleaning up.
    Nov 07, 2020 6:47:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T06:47:31.533Z: Stopping worker pool...
    Nov 07, 2020 6:48:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T06:48:37.237Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 07, 2020 6:48:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T06:48:37.276Z: Worker pool stopped.
    Nov 07, 2020 6:48:43 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-06_22_45_21-1998521765301790919 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): e6514e63-f40f-4853-b642-2a3a1b210e1d and timestamp: 2020-11-07T06:48:43.617000000Z:
                     Metric:                    Value:
                   read_time                    25.674
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 07, 2020 6:48:44 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
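
The warning above only affects metrics publishing: the read_time and fields_read values are printed to the console but are not exported to InfluxDB because no measurement/database was configured for the publisher. A sketch of the presumably missing configuration, assuming the usual Beam test-infra option names (the flags and values below are illustrative, not taken from this job's configuration):

    -DintegrationTestPipelineOptions='["--influxDatabase=beam_test_metrics","--influxMeasurement=sql_bqio_read_java_batch","--influxHost=http://localhost:8086"]'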

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 3 mins 36.448 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 27s
137 actionable tasks: 96 executed, 41 from cache

Publishing build scan...
https://gradle.com/s/3yytklym3hsda

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1208

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1208/display/redirect?page=changes>

Changes:

[noreply] [BEAM-11200] Update useragent version of Go SDK (#13279)


------------------------------------------
[...truncated 343.65 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 07, 2020 12:47:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 07, 2020 12:47:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 07, 2020 12:47:27 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 07, 2020 12:47:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 07, 2020 12:47:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 07, 2020 12:47:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 07, 2020 12:47:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1452961901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:280)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:153)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:546)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
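
The failure above is the missing Row schema on the PCollection produced by BeamIOSourceRel_95/ParDo(RowMonitor): a Row-typed PCollection cannot infer its coder, so it must be given a schema (or an explicit RowCoder) before the pipeline is finalized, exactly as the exception message suggests. A minimal, self-contained sketch of the pattern (not the IT's actual code; names and field types are illustrative):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFixSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Schema for the four projected columns of the query above (types are illustrative).
        final Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT32)
                .build();

        PCollection<Row> rows =
            p.apply(Create.of("seed"))
                .apply(
                    "ToRow",
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void processElement(ProcessContext c) {
                            c.output(
                                Row.withSchema(schema)
                                    .addValues("someone", "story", "A title", 3)
                                    .build());
                          }
                        }))
                // Without this, coder inference fails exactly as in the stack trace above;
                // setRowSchema is shorthand for setCoder(RowCoder.of(schema)).
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }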

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 07, 2020 12:47:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 07, 2020 12:47:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 07, 2020 12:47:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 07, 2020 12:47:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 07, 2020 12:47:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 07, 2020 12:47:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 07, 2020 12:47:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 07, 2020 12:47:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
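
In the DIRECT_READ plan above, the projection (usedFields) and the supported filter are pushed into the BigQuery Storage Read API, so only four columns and pre-filtered rows leave BigQuery. The effect is roughly what one would write by hand with BigQueryIO; a sketch under that assumption (not the IT's code, and the table name is illustrative):

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // The Storage Read API returns only the selected columns and rows matching the
        // restriction, which is what BeamPushDownIOSourceRel_229 achieves for the SQL query.
        p.apply(
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full") // illustrative table name
                .withMethod(Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
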
    Nov 07, 2020 12:47:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 07, 2020 12:47:31 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 07, 2020 12:47:31 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-HtfOYJsPTRnRM3usTi47WnVG-w78pa6-cPPMb-9WJsU.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-xYfV-mM6If-k_IxBNyDfi5QkZFrvrYfx4TPXJgwW8MY.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-qEwOYtG6es03Vf5WzdjkMERjU_c5q2kySoNGOCtGsdQ.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-d_y3CnO3NTQbMdBxxTFqI-ZfJm2NFuKXHG-SF3sY55E.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4114248645473349463.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-dQHubp6F_02KwRARwZ_ZnFF6F4zrKWp7AM2sPVcyuhc.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-J6HgJUlo4EznucPF6DgjLSMLeTTJ1LKlZFUWlf5qLQQ.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-7kEbtjynH7-A8rd5tPWMtdA32HKtl_XZq8gJeE9SX7w.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests-vhuISAXTEbUSjShMsPvM46N_jSc4VIYke0WtjWVTGSY.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests-VqWC5FR5J3Ps1k_EllLKBEjbsDrChEpBNZJvbCsMoLg.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests-qh3GF3IQgcl9x2ZIr9-YJCnT6-lnDFxVttNWxU9n6t0.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests-QHELwrOgdfA0Mi-ly2gy_7HOBtS3F11YDKUnz4ZbnlQ.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT-o14gE4vagvRxv2HH-XVr0qvM37Pgpq_MdSFtaTOIXi4.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests-TW4KFsfNCQbwOM9aodVuOBm8Szon7xTjhKEPdtQeDLQ.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests-cznj0WyZOB0-B17TtBBUI3dAbD6n0gs0BwP64TKEyxM.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-AgEN_V_uR_Li07Rsy7ErJ2tTGdXz_y3wVu62SqM1bLE.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-MXsCCQRTwJk1528XpQ3kZdDZlsSA28Xl4_OwRN7cN-M.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-HtfOYJsPTRnRM3usTi47WnVG-w78pa6-cPPMb-9WJsU.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT-x-h8oKfr4XkP2yPCcydlmPDpyTkmYo9mEj9DPRFY_L4.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-gAEqvp0sTIHwNWDqQzxAuMduN1nY-wZ-vTz-BKL0KCU.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT-DNRSRC42DlQDLlhjd58Ugpn8kE65LfjaAmtd-lAYDzE.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.27.0-SNAPSHOT-oR8ehkZtVDPArC9LDmsK4GyFrTcytAlUVtp43qKfGyw.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.27.0-SNAPSHOT-q5T0X4P8fW3jRuEJThki0zT2C3ReCcradbc8cvvUlos.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests-FyQ0BMoy-mozEhknKz2fkk9W82SIR0YtEXje_U_xLjM.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.27.0-SNAPSHOT-SWIppCCTeqq9WU546ABC204TKv_zGeKi9oLsOQck91U.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-ZJ6Y2XuePb02A6NAJPTDyWqdcD7Dg8Kudetw-qWCo6I.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded-Ykj9gNgBrSi6x_jbGvamLV6ap6448kolEb8G11LMReM.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-9WjLM2TsljyjPG8jcRCBpePhRiBmyqq_RMRXS-Cwqa8.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-tests-7tubNP0uIRv31AELBq7KlKbU7DN2d8EER5EF9ZCssso.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.27.0-SNAPSHOT-wqy39cIyPvnRf9wkzH74w4mdu_7aqhQ0GJPPTJRDAHs.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.27.0-SNAPSHOT-1yEhd2n2QcYaZCatk7ClRTXcx5d8hXAJwYbb-PksRc8.jar
    Nov 07, 2020 12:47:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.27.0-SNAPSHOT-hefUuKdR7OUafFUazqyA0R5brVOrLkOU_Z4DVWsS5kY.jar
    Nov 07, 2020 12:47:33 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 196 files cached, 30 files newly uploaded in 1 seconds
    Nov 07, 2020 12:47:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 07, 2020 12:47:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 07, 2020 12:47:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 07, 2020 12:47:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 07, 2020 12:47:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 07, 2020 12:47:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98996 bytes, hash ed2e78f92cde01c601bc16ce4a5d4b1383460a011f18b3b44c72b00f8b02f6d1> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-7S54-SzeAcYBvBbOSl1LE4NGCgEfGLO0THKwD4sC9tE.pb
    Nov 07, 2020 12:47:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.27.0-SNAPSHOT
    Nov 07, 2020 12:47:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-06_16_47_33-17423228733461652333?project=apache-beam-testing
    Nov 07, 2020 12:47:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-06_16_47_33-17423228733461652333
    Nov 07, 2020 12:47:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-06_16_47_33-17423228733461652333
    Nov 07, 2020 12:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-07T00:47:33.768Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 07, 2020 12:47:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T00:47:41.932Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 07, 2020 12:47:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T00:47:42.719Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 07, 2020 12:47:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T00:47:42.754Z: Expanding GroupByKey operations into optimizable parts.
    Nov 07, 2020 12:47:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T00:47:42.791Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 07, 2020 12:47:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T00:47:42.885Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 07, 2020 12:47:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T00:47:42.917Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 07, 2020 12:47:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T00:47:42.955Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 07, 2020 12:47:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T00:47:42.987Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 07, 2020 12:47:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T00:47:43.322Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 07, 2020 12:47:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T00:47:43.390Z: Starting 5 workers in us-central1-f...
    Nov 07, 2020 12:48:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T00:48:16.760Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 07, 2020 12:48:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T00:48:18.618Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 07, 2020 12:48:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T00:48:30.493Z: Workers have started successfully.
    Nov 07, 2020 12:48:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T00:48:30.532Z: Workers have started successfully.
    Nov 07, 2020 12:49:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T00:49:11.097Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 07, 2020 12:49:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T00:49:11.263Z: Cleaning up.
    Nov 07, 2020 12:49:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T00:49:11.356Z: Stopping worker pool...
    Nov 07, 2020 12:50:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T00:50:02.802Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 07, 2020 12:50:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-07T00:50:02.848Z: Worker pool stopped.
    Nov 07, 2020 12:50:09 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-06_16_47_33-17423228733461652333 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 2116191a-eb98-4fd4-9291-dd251537f3ea and timestamp: 2020-11-07T00:50:09.317000000Z:
                     Metric:                    Value:
                   read_time                    20.654
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 07, 2020 12:50:09 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 51.36 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 52s
137 actionable tasks: 99 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/jbeuejadez2zg

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1207

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1207/display/redirect>

Changes:


------------------------------------------
[...truncated 337.63 KB...]
    INFO: BigQuery method is set to: DEFAULT
    Nov 06, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 06, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 06, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 06, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 06, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 06, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1452961901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:280)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:153)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:546)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 06, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 06, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 06, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 06, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 06, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 06, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 06, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 06, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Nov 06, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-Oa7o8O8xGL7qwzRVLU3liK8jkmWsJ5xAxWXNp30vry0.jar
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests-Rutu-ZcXiook6pNmyWxdzQ6uY0qL68W8mLv9yRCVF_8.jar
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-zntEtx1pcK6YwbVeDu4PLDE7TgN29oYaT6LGbwdrYU0.jar
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests-iVaoyeIjJDT8puJCE2xifsFZoB0h7fUAe2RnzfKRvxk.jar
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2378599876861574908.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-qVsOaEXB2ID4_l8XDm5u2gdwfgtieO3ekibsxL0bVg8.jar
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests-upr4EdnpNPq5Pbb4utfKHnziqNLO3-G_YpzjnkdK6B0.jar
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests-jDKO324MLkDDAeULn5TWffRrC77OOgjerGh2KwVS6kg.jar
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-fIt0g7EQBrGYs4ZEsKnkt3hAm4Fh0K-bVOxvj2P3bnw.jar
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests-0btyq3NFqDRsOR_D-Cslo4mbX98BtTT_xNtN5dIHmVQ.jar
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests-YiZo_ugzvzzBbwLA4na5gM6A09AOUP1nYgyL6JBNz_Y.jar
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-Rul_qNAe7NTLHMtHLIOBlTWwkgeCNXVts36nN1lsJyM.jar
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-I9t6vC65fXwIkKyW8OjvRg93EcQMmEtp3RvHh3duNZg.jar
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-njAxqCIK4m70jHPMl61svOc7n-uqDrc775QKmqLUJIk.jar
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-rQ9vvZICMeEz47Jx9l_g8_VUpvfuTr8juy7swsuN8Hw.jar
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-4Z8P2BArEADD1ZXzkFkwERaLkrOqC4gyydzEVcUBWtU.jar
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT-MY-gCCgsCemeucukOXfhHYVtsW-azM3zt2NFsi2Gy3I.jar
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests-g8gMKnnRLNJIvgTT2tkf3YtAmQtLYayv5rxLDhnja5s.jar
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT-1iNSE2H1YUmDawUVtxq5kKJqngflANnLOfcKWz_J-Ig.jar
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.27.0-SNAPSHOT-YgHXguwnOmMMl0ZhNUqIBFuW2ZkoI2eD_PyehZl8Uto.jar
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT-kfTMt7qeru8WCTgV-LhSgqpSqmSh4O125CzWlxMvq1w.jar
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.27.0-SNAPSHOT-JbNUQfj1CdZ67S2xiuzm9eyuL5aXfnNgnAyARa2wZPs.jar
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-68ypq7-ymSgWPrv9SPkxWF_sCKlyytOtHpzChCxzPmo.jar
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-Oa7o8O8xGL7qwzRVLU3liK8jkmWsJ5xAxWXNp30vry0.jar
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.27.0-SNAPSHOT-LapWvaGpXamAzNocsAEY0STtCSWz_XTalRY8qqZKVs4.jar
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-tests-EJ9YNVs9YA-jCHks-ytLu5QjfBQcyUzPmC7weDCaPWM.jar
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-5AsrCjP1ZVV4h2KquNoxWdh1nhFFP1Iwd6zw5GTFr1o.jar
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-sdpp9L3vaoKjjdytpjjnNgAdGUSsoM_7HL_R5-_Ol30.jar
    Nov 06, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded-Z-ROjtNNfdmJIk9V0e2UZAEr8E4yjtas-hu2Ffyhzbs.jar
    Nov 06, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.27.0-SNAPSHOT-4u4QcFGU3SlVGKMwCtPHDBFlZ4x3pK6OEI6iS0MAz88.jar
    Nov 06, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.27.0-SNAPSHOT-4gjOQunCGiT7T71RAFtiUX8QWc1GKxYLpzrkLRF4HFE.jar
    Nov 06, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.27.0-SNAPSHOT-Zn-tKtCeBZ4TmjgzVz5HBPQxu9nLRhuPRPats7ghMo0.jar
    Nov 06, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 196 files cached, 30 files newly uploaded in 0 seconds
    Nov 06, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 06, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 06, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 06, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 06, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 06, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98996 bytes, hash d16f3ac2d3e20815596c4f1f81dde57a052e2bac180e4c4f13cf2b69af95df1d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-0W86wtPiCBVZbE8fgd3legUuK6wYDkxPE88raa-V3x0.pb
    Nov 06, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.27.0-SNAPSHOT
    Nov 06, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-06_10_45_22-3349206509671855995?project=apache-beam-testing
    Nov 06, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-06_10_45_22-3349206509671855995
    Nov 06, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-06_10_45_22-3349206509671855995
    Nov 06, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-06T18:45:22.109Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 06, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T18:45:30.449Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 06, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T18:45:31.158Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 06, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T18:45:31.203Z: Expanding GroupByKey operations into optimizable parts.
    Nov 06, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T18:45:31.230Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 06, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T18:45:31.315Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 06, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T18:45:31.343Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 06, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T18:45:31.369Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 06, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T18:45:31.419Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 06, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T18:45:31.908Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 06, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T18:45:31.982Z: Starting 5 workers in us-central1-f...
    Nov 06, 2020 6:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T18:45:45.308Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 06, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T18:45:55.263Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Nov 06, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T18:45:55.291Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Nov 06, 2020 6:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T18:46:05.534Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Nov 06, 2020 6:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T18:46:05.563Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Nov 06, 2020 6:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T18:46:15.832Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 06, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T18:46:20.049Z: Workers have started successfully.
    Nov 06, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T18:46:20.089Z: Workers have started successfully.
    Nov 06, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T18:46:56.862Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 06, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T18:46:57.076Z: Cleaning up.
    Nov 06, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T18:46:57.343Z: Stopping worker pool...
    Nov 06, 2020 6:48:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T18:48:12.669Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 06, 2020 6:48:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T18:48:12.723Z: Worker pool stopped.
    Nov 06, 2020 6:48:18 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-06_10_45_22-3349206509671855995 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): e9c72126-7ae8-4f9b-9a6a-6cf6e47cd3c9 and timestamp: 2020-11-06T18:48:18.767000000Z:
                     Metric:                    Value:
                   read_time                    15.538
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 06, 2020 6:48:19 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 3 mins 10.658 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 3s
137 actionable tasks: 96 executed, 41 from cache

Publishing build scan...
https://gradle.com/s/5maabjiyljzrq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1206

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1206/display/redirect>

Changes:


------------------------------------------
[...truncated 335.66 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:546)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 06, 2020 12:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 06, 2020 12:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 06, 2020 12:45:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 06, 2020 12:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 06, 2020 12:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 06, 2020 12:45:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 06, 2020 12:45:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1452961901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:280)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:153)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:546)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
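
The IllegalStateException above is raised at pipeline-construction time: the PCollection of Beam Rows emitted by the RowMonitor ParDo has neither an explicit Coder nor a Schema, so coder inference fails. As the message itself suggests, the usual remedy in user code is to attach a row schema (or an explicit RowCoder). A minimal, self-contained sketch of that pattern follows; the class name, schema, and values are illustrative only and are not the IT's actual wiring:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Illustrative schema; the real query selects by/type/title/score from HACKER_NEWS.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        Row row =
            Row.withSchema(schema).addValues("someone", "story", "a title", 3L).build();

        // Without a schema (or an explicit Coder), coder inference for a PCollection<Row>
        // fails with exactly the IllegalStateException shown above.
        PCollection<Row> rows = p.apply(Create.of(row).withRowSchema(schema));

        // Equivalent fixes after the fact:
        //   rows.setRowSchema(schema);
        //   rows.setCoder(org.apache.beam.sdk.coders.RowCoder.of(schema));

        p.run().waitUntilFinish();
      }
    }

Either form gives the PCollection a coder for Row, which is all the Preconditions.checkState call in the trace above is asking for.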

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 06, 2020 12:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 06, 2020 12:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 06, 2020 12:45:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 06, 2020 12:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 06, 2020 12:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 06, 2020 12:45:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 06, 2020 12:45:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 06, 2020 12:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
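
The line above is the crux of the push-down path: rather than reading every column and filtering afterwards in a BeamCalcRel, the planner emits a BeamPushDownIOSourceRel that requests only the used fields and hands the supported predicate to the BigQuery Storage Read API. At the IO level this corresponds roughly to the following sketch; the table reference and wiring are illustrative assumptions, not the test's actual setup:

    import java.util.Arrays;

    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class PushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        PCollection<TableRow> rows =
            p.apply(
                "Read with push-down",
                BigQueryIO.readTableRows()
                    // Hypothetical table reference, for illustration only.
                    .from("my-project:my_dataset.hacker_news")
                    .withMethod(TypedRead.Method.DIRECT_READ)
                    // Project only the columns the query uses...
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // ...and push the supported filter into the storage read itself.
                    .withRowRestriction(
                        "(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }

Pushing the projection and the row restriction into the read reduces the data the pipeline has to fetch and decode, which is the point of the DIRECT_READ push-down variant exercised by this test.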
    Nov 06, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 06, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 06, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-wLfHFEixi1-VB2M5v2pGACmCQpCBP4JtAdUxJ9kaEm4.jar
    Nov 06, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests-1JQb6I8Yzcts9USApQ0lSFAF57jbhs_whUj4fLX6q8g.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests-vudAApY_eq94OGwEP38CfqnbOKmrTZUdc-ERTW3d_VA.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests-BiiBRWt7_xDQWWvzVXxBCwlxzizNIKay1MvGOTTgsQ4.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests-bLdTIJiLVq_66_fjoCTdB32aMxZzv4C2UYsU0dFuL9Y.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7273588107002246040.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-TttO2v5GI133EpPi7XJlCfH4hEUgtcx30cbKKgrhZn8.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-qBowHtT1hWDtp_v7E_j6dafEZYpCqwAhl5mS2x1lP-k.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests-jW1rZ-WluHsirzcYn17Zt1nT_VVSeHDpvvwqfoz5Ifc.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-c4bDag9G7g7LJgvM1Lb5bZa-1sdIsg2R8_aqC0jXuzs.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-ZnFvMO83DY3DAtLT4KwKwLoo5vykx-00AkIh-e6Y1r0.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-adbSpDdSjxYlKVhxyLRsNFVduGnp5kX0ZyUEp185VVs.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-FuXkn5XDX5pIP1qymky1IfdP9PPtpUZPJcTZ9shc780.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT-3tKCmOoYoST005zHxPVTTWAbp1oY2sGb4t7sFLOdMwA.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.27.0-SNAPSHOT-5QYEbNfExcfvmbsKdKFbK7Oo7cHU0_OYGh3UGnPZBMI.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-wLfHFEixi1-VB2M5v2pGACmCQpCBP4JtAdUxJ9kaEm4.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT-gckYp1M6DGf4FccACWfXPANT666U-rMgzVtWmz1PMEE.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-rwvt-M38f51hk5ie-FMLv47UvXYV1sclpnRomUoijPQ.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.27.0-SNAPSHOT-0N7HCT7ToLVbDq2dH8wvRZydYmLWtJxWBFncJMJA-mo.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-9gRgcSnbNgzUXGEkp1gprr_hld_YrC3FOiy1ztU6Sp8.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.27.0-SNAPSHOT-Ldy7lit8lDdh2EhZm3wPRagnQwgCWjo7T0e-A7_7a8s.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests-iT88dPxxJD7yGM3a_4ELPmEv04Ehb8OLyfcyCMjxyjk.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests-QKgxuzSLwTWgjxYXdNANtq3dxTRxrG_gwilUMCXpk-Q.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT-8coIO_Q_DGaPoG_rfSZxvKwUzJRDJN0-DBLZ7nhDd-E.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-eKX6BmkwhCexu-eUsgfdLO0zCpMUgVjN258aBvGMCb0.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-tests-_dZSSrXQtPeJcDVt0fum3AFUfXqt4bBgWXa1v_bEqpI.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-0s1gQoX5kWYkSGtnRTLVcqZed4uExAEdU8oeZbG6He4.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-vUilstSc0Tgsmebi31Rx2QCUHQsbyM3ZLdwbYYQSKRg.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded-b7tvCYFo0LpTlTGNkqwsgiq2wyhdYpBxuF9xl25V0jo.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.27.0-SNAPSHOT-r12InUxw8SStGFQGwwI-3hzDGEJUuQYe3AFQzoQTEpM.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.27.0-SNAPSHOT-3QdE1vyHZsXkhzvT6Tk14XuhnZ34ImgSgxDNxnzV61g.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.27.0-SNAPSHOT-HvC6KiWAWRXZOG6i2rtq9Y9VubUcmnb36cmhKLXZmf8.jar
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 196 files cached, 30 files newly uploaded in 1 seconds
    Nov 06, 2020 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 06, 2020 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 06, 2020 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 06, 2020 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 06, 2020 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 06, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98996 bytes, hash 55d19b9f9c512c02bb73728302240230b565679325c952a57e0c0eed0cc5c33d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-VdGbn5xRLAK7c3KDAiQCMLVlZ5MlyVKlfgwO7QzFwz0.pb
    Nov 06, 2020 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.27.0-SNAPSHOT
    Nov 06, 2020 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-06_04_45_28-1344194947846049697?project=apache-beam-testing
    Nov 06, 2020 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-06_04_45_28-1344194947846049697
    Nov 06, 2020 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-06_04_45_28-1344194947846049697
    Nov 06, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-06T12:45:28.465Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 06, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T12:45:36.805Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 06, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T12:45:37.493Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 06, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T12:45:37.533Z: Expanding GroupByKey operations into optimizable parts.
    Nov 06, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T12:45:37.572Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 06, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T12:45:37.653Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 06, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T12:45:37.682Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 06, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T12:45:37.711Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 06, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T12:45:37.744Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 06, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T12:45:38.344Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 06, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T12:45:38.428Z: Starting 5 workers in us-central1-f...
    Nov 06, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T12:45:52.595Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 06, 2020 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T12:46:01.381Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 06, 2020 12:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T12:46:20.722Z: Workers have started successfully.
    Nov 06, 2020 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T12:46:20.755Z: Workers have started successfully.
    Nov 06, 2020 12:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T12:46:52.415Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 06, 2020 12:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T12:46:52.578Z: Cleaning up.
    Nov 06, 2020 12:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T12:46:52.648Z: Stopping worker pool...
    Nov 06, 2020 12:47:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T12:47:44.275Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 06, 2020 12:47:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T12:47:44.353Z: Worker pool stopped.
    Nov 06, 2020 12:47:50 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-06_04_45_28-1344194947846049697 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 601c68d2-a783-4237-bb70-3c701124d2b2 and timestamp: 2020-11-06T12:47:50.708000000Z:
                     Metric:                    Value:
                   read_time                    11.609
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 06, 2020 12:47:51 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 37.089 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 33s
137 actionable tasks: 96 executed, 41 from cache

Publishing build scan...
https://gradle.com/s/uz6yhru7hmvr6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1205

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1205/display/redirect?page=changes>

Changes:

[noreply] [BEAM-11159] Use GCP pubsub client for TestPubsub (#13273)


------------------------------------------
[...truncated 350.54 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 06, 2020 6:47:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 06, 2020 6:47:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 06, 2020 6:47:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 06, 2020 6:47:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 06, 2020 6:47:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 06, 2020 6:47:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 06, 2020 6:47:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1452961901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:280)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:153)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:546)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 06, 2020 6:47:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 06, 2020 6:47:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 06, 2020 6:47:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 06, 2020 6:47:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 06, 2020 6:47:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 06, 2020 6:47:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 06, 2020 6:47:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 06, 2020 6:47:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Nov 06, 2020 6:47:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 06, 2020 6:47:23 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 06, 2020 6:47:23 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-IV6j5NE6rXiLHcpHqUgnjHWzKkDWjqFjbC6Sa_dgmPk.jar
    Nov 06, 2020 6:47:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests-Pdn_gzWxohAfyT7Tc2im6RFwkjb-zoV8GKHaNN7snMg.jar
    Nov 06, 2020 6:47:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests-rEuj1xAoxHA9jYOWnQN3ONSgU_RFMu15eTUEyYkK6cg.jar
    Nov 06, 2020 6:47:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-OY-8MQ8bNB7q5ZD2NXSmolblhcFw2gQAJvYEdcwxvBk.jar
    Nov 06, 2020 6:47:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-0aRYI8jG0JcIPaokYEy_HLeAwFWDbiPzk3Uovlw5zHg.jar
    Nov 06, 2020 6:47:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-7Q3d_CrnVrLZO6ABAIQOtG3XL52XTudL0rgmUDGhPU4.jar
    Nov 06, 2020 6:47:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-B2MC-rStBoiUSvkVdBgG_xNyAXTmvT852ROPBCgDRlY.jar
    Nov 06, 2020 6:47:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests-8-UjvPK1_luvwODKM9Yva-P0sEnQUx6X5En2j8cuUHw.jar
    Nov 06, 2020 6:47:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-MTIv-k-5oWrAI5Z8iGhLewRinqg7RJ9LmrB7xGpUlv0.jar
    Nov 06, 2020 6:47:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests-8J8-9LUj6KYE9zbObDYjTy2_UugeVQL8yhfGOB8LAsQ.jar
    Nov 06, 2020 6:47:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests-DKKqr5In3er3ReqhIleYgV5UvH_rVr4cvZCn0pvmClg.jar
    Nov 06, 2020 6:47:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1990562829729254004.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-cnluwkyFuXuoixFiv_auhwCUh93s8Wbzmchz8VJQy2c.jar
    Nov 06, 2020 6:47:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests-lut5RRoWVVpgGAt1PSKiELiqHoOLLmVZ5PWzk9LQ2EE.jar
    Nov 06, 2020 6:47:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-wjJq0ID87auPX-kZyK2Tfbq_t3Hotl5VB3sGQD3reUA.jar
    Nov 06, 2020 6:47:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-VVD_qA1ZwaHoNTCbcPaZ1sIiTiwv-kRaelwvWwcYJaw.jar
    Nov 06, 2020 6:47:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT-2TYk0ez01UN5-9uipspe66deAM-tZ_0R4rHwSJ_NUds.jar
    Nov 06, 2020 6:47:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.27.0-SNAPSHOT-wFc1zT0DdusSlqoLKJcVEfPbPvqAByyTXSMdUHPNe40.jar
    Nov 06, 2020 6:47:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.27.0-SNAPSHOT-J7QDCbgRAjCW211dQmWr1y2q93oOrcGCHtudKorNUNE.jar
    Nov 06, 2020 6:47:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-urPsIsRVvpihyJwsJiRWv0_ycAwS1T6a7zQqPJNyrAw.jar
    Nov 06, 2020 6:47:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests-gupAo-QJfTEeMEZn8aCgIEsnrt-dCDLAoxSf1ab8VHs.jar
    Nov 06, 2020 6:47:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT-fQntH1gOO6OnmHxCMcx85-rVvsisCXJuvq9K9jdTF58.jar
    Nov 06, 2020 6:47:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.27.0-SNAPSHOT-4sYzD8XJ6brHwCvCwbNvULlycWRCA8vh2lBBejX5KrY.jar
    Nov 06, 2020 6:47:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT-RiWnFNv9CffM3hgJV0g0qPHjYvmEX9pfePSTfP7qz-E.jar
    Nov 06, 2020 6:47:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-wm0B9zN2dMUxyMapQUdOj6EzhKAl6pOAsSHlAHcJ6Hg.jar
    Nov 06, 2020 6:47:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded-D7A7FvKfYWrHojGjPv8Hv1d0JdHN1JBv6zL65hgAAW4.jar
    Nov 06, 2020 6:47:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-IV6j5NE6rXiLHcpHqUgnjHWzKkDWjqFjbC6Sa_dgmPk.jar
    Nov 06, 2020 6:47:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-tests-PB6Oz2EtsXoCtRkXjiNf2HNeMQazMa1KZ3zqGjZF2yg.jar
    Nov 06, 2020 6:47:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-gawyh2yCkZTr4BNrRS-vdb16vjKWN49cqU89CfLgGfQ.jar
    Nov 06, 2020 6:47:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.27.0-SNAPSHOT-yvzUW2ThDfnct6EN_jRjToVyAx2nuERLObA3-8NC_Xc.jar
    Nov 06, 2020 6:47:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.27.0-SNAPSHOT-0OD362Z-y84stDtEx7Ki8MiZDuRVmizzQkXNM0vEo_8.jar
    Nov 06, 2020 6:47:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.27.0-SNAPSHOT-QnqgS7jvJqJyQPfbd14nmQDkvE8_pHcJd5Yi5R7hsLo.jar
    Nov 06, 2020 6:47:24 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 196 files cached, 30 files newly uploaded in 0 seconds
    Nov 06, 2020 6:47:24 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 06, 2020 6:47:24 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 06, 2020 6:47:24 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 06, 2020 6:47:25 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 06, 2020 6:47:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 06, 2020 6:47:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98996 bytes, hash 41a36b4339505a64883b6403b29cc66f614e5d0152c73b46c32bdeb1e4a2e076> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-QaNrQzlQWmSIO2QDspzGb2FOXQFSxztGwyveseSi4HY.pb
    Nov 06, 2020 6:47:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.27.0-SNAPSHOT
    Nov 06, 2020 6:47:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-05_22_47_25-7906395366874687293?project=apache-beam-testing
    Nov 06, 2020 6:47:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-05_22_47_25-7906395366874687293
    Nov 06, 2020 6:47:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-05_22_47_25-7906395366874687293
    Nov 06, 2020 6:47:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-06T06:47:25.373Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 06, 2020 6:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T06:47:34.703Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 06, 2020 6:47:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T06:47:35.563Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 06, 2020 6:47:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T06:47:35.602Z: Expanding GroupByKey operations into optimizable parts.
    Nov 06, 2020 6:47:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T06:47:35.627Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 06, 2020 6:47:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T06:47:35.710Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 06, 2020 6:47:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T06:47:35.735Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 06, 2020 6:47:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T06:47:35.771Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 06, 2020 6:47:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T06:47:35.807Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 06, 2020 6:47:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T06:47:36.367Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 06, 2020 6:47:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T06:47:36.454Z: Starting 5 workers in us-central1-f...
    Nov 06, 2020 6:47:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T06:47:42.007Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 06, 2020 6:48:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T06:48:02.362Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Nov 06, 2020 6:48:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T06:48:02.439Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Nov 06, 2020 6:48:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T06:48:07.710Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 06, 2020 6:48:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T06:48:22.241Z: Workers have started successfully.
    Nov 06, 2020 6:48:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T06:48:22.280Z: Workers have started successfully.
    Nov 06, 2020 6:48:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T06:48:53.544Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 06, 2020 6:48:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T06:48:54.013Z: Cleaning up.
    Nov 06, 2020 6:48:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T06:48:54.381Z: Stopping worker pool...
    Nov 06, 2020 6:49:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T06:49:46.251Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 06, 2020 6:49:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T06:49:46.332Z: Worker pool stopped.
    Nov 06, 2020 6:49:52 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-05_22_47_25-7906395366874687293 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0b97cc59-24c9-42b9-9657-466dd30c194d and timestamp: 2020-11-06T06:49:52.353000000Z:
                     Metric:                    Value:
                   read_time                    11.283
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 06, 2020 6:49:52 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 4 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 40.951 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 35s
137 actionable tasks: 104 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/gnxhyd2jyxlyc

Stopped 3 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1204

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1204/display/redirect?page=changes>

Changes:

[Andrew Pilloud] [BEAM-11165] ZetaSQL Calc only convert referenced columns

[Robin Qiu] Support read/write ZetaSQL DATETIME/NUMERIC types from/to BigQuery

[Robin Qiu] Address comments

[Robert Burke] Moving to 2.27.0-SNAPSHOT on master branch.

[Andrew Pilloud] [BEAM-11165] Use the ZetaSQL Streaming API synchronously


------------------------------------------
[...truncated 346.22 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 06, 2020 12:47:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 06, 2020 12:47:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 06, 2020 12:47:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 06, 2020 12:47:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 06, 2020 12:47:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 06, 2020 12:47:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 06, 2020 12:47:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1452961901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:280)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:153)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:546)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
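
The root causes listed above point at the usual fix: a PCollection of Beam Rows has no inferable Coder, so a schema has to be attached explicitly. A minimal sketch of that fix (not the IT's actual code; the schema and sample values below are assumed purely for illustration), using the PCollection.setRowSchema call the error message suggests:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create();

        // Assumed schema, mirroring the columns projected by the failing query.
        Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();

        PCollection<Row> rows =
            pipeline.apply(
                Create.of(Row.withSchema(schema).addValues("alice", "story", "hello", 3L).build())
                    .withRowSchema(schema));

        PCollection<Row> monitored =
            rows.apply(
                    "RowMonitor",
                    ParDo.of(
                        new DoFn<Row, Row>() {
                          @ProcessElement
                          public void processElement(@Element Row row, OutputReceiver<Row> out) {
                            out.output(row);
                          }
                        }))
                // A DoFn that emits Row has no inferable output coder; attaching the
                // schema here is what the IllegalStateException above is asking for.
                .setRowSchema(schema);

        pipeline.run().waitUntilFinish();
      }
    }

With the schema attached, the Row elements get a schema-based coder and Pipeline.applyInternal no longer fails while finishing the transform's specification.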

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 06, 2020 12:47:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 06, 2020 12:47:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 06, 2020 12:47:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 06, 2020 12:47:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 06, 2020 12:47:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 06, 2020 12:47:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 06, 2020 12:47:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 06, 2020 12:47:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
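
For the DIRECT_READ variant, the planner output above shows what the push-down amounts to: only the used fields are requested and the supported predicate is handed to the BigQuery Storage read. A rough equivalent expressed directly against BigQueryIO (a sketch only; the table reference is assumed, and the IT actually drives this through its SQL table provider rather than calling BigQueryIO itself):

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create();

        pipeline.apply(
            "Read Input BQ Rows with push-down",
            BigQueryIO.readTableRows()
                // Assumed table reference, for illustration only.
                .from("bigquery-public-data:hacker_news.full")
                .withMethod(Method.DIRECT_READ)
                // Projection push-down: request only the columns the query uses.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Filter push-down: the predicate logged above, evaluated server-side.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        pipeline.run().waitUntilFinish();
      }
    }

Whether a predicate can be pushed down depends on what the Storage API row restriction supports; the plan above records the supported part under BigQueryFilter and leaves unsupported{} empty.
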
    Nov 06, 2020 12:47:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-hMCzvO6FLnViuBpdjLlDMzS8_idb8isSH4uIW8f0n6U.jar
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-DFFWTi4GgVLC4hqOtYeZD77zIqKlzWlPUHqikotPwM0.jar
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4266622724430617620.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-bMuNF5D2G_mWaHgV6HT3m9BcckYtF4OMaPghvBLPPaM.jar
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-DhyOzZOkP8m0eeF-L7HiqG-KaEKYRfCbsOabI5o-iwk.jar
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-P5qBTYoFS5h2fqc0Mkzt57U4G-oHuQkUFrFHLnNzAII.jar
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.27.0-SNAPSHOT-tests-ICC8ww2GSmJKJr8MCbNxV86LrHTRu81soo_zqS1W7mU.jar
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-P4yN43zeOVackX8s4Zm88QEEw1FA19hYWVq7I2Cub8s.jar
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.27.0-SNAPSHOT-UNFcn5vpdzh2kMQtbrMuBruJFcUIZZo5hTl0N_kCOI4.jar
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-tests-HvgE5JXEx4xfPyiURTti3LCRWbmQc3r7IwFXyPM7w-I.jar
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-FnePoCd6YcFFbHC_zYtD2qjubYa_N-mRHummMiPk2FE.jar
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.27.0-SNAPSHOT-tests-XSp_YvBKuCMxakYV3_i91LvGD4d-A7mWcqkDKDiaFQU.jar
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.27.0-SNAPSHOT-KNcDXCHu_zNi4qEAMrKU1Jop_D7jgey8pmj0c53DWp8.jar
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-I7VIZgd0-aYZ_ZaWF6aWRaZFYOf_50bm2Z_OH3QZ4v4.jar
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.27.0-SNAPSHOT-tests-JMvLDv47r0jeHXxM9nhEfb8ngAGPWAtRQp17oR9cDg4.jar
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.27.0-SNAPSHOT-tests-bt_XY1BAy52SUhlgh_o7pu2ymOZgnoaS_uCJUqEKCMM.jar
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.27.0-SNAPSHOT-tests-a1nU5ZFIWlaTpNMuqUNE0fN52gi7wvpqvcBcMdotg-Y.jar
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-tests-GSXJql12kRI2b4P0lyiscibxHfMUsMJNCSts_NDTl2g.jar
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.27.0-SNAPSHOT-ZzkBSrJTCPZLdBMHBIZdq9Ka5Pr2yHV1Q4j6fH1R9Wk.jar
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.27.0-SNAPSHOT-nuErkGHkih_BEeurEg-i72Ap1f_5rFp9Lrak5EWGyuA.jar
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.27.0-SNAPSHOT-9lnYkwmP6sb2TZZP78pa2Wt8LCrqLF2xXFWc1nwON94.jar
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.27.0-SNAPSHOT-b_6CGwVxZiP9aXqAMfKuUMpz0sS1QiEznKEAh4c_9HQ.jar
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.27.0-SNAPSHOT-_WCDA_0KxCyVyYLgqa8hzP8sIUl6CHP_bLf9XmwI4is.jar
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.27.0-SNAPSHOT-xxZf76KvOL66wO0V8_j2-o9hTjgsByH74728xO5Qayc.jar
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-unshaded-ZApaP8_lwywFWRE5WkVf6gQ9iBEiQbLmKQqXjpWV9S0.jar
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-U6HEuVjvjhqG19hsP6D6A3cyqIR_nfYU2lB9u78m7sU.jar
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.27.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.27.0-SNAPSHOT-tests-U6IX4FZStlmQAjOwZtHxeRGzm4KovJIYylAityvdFlY.jar
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.27.0-SNAPSHOT-Ayt-nmaRodRv9Ryb95HuZPUWG5ZLjPPBZdReFxTxuvs.jar
    Nov 06, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.27.0-SNAPSHOT-hMCzvO6FLnViuBpdjLlDMzS8_idb8isSH4uIW8f0n6U.jar
    Nov 06, 2020 12:47:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.27.0-SNAPSHOT-FRsoxLUVszS2iuKD18toaGXOgJQgxCdAOJ01EicaedQ.jar
    Nov 06, 2020 12:47:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.27.0-SNAPSHOT-yGZJ1fxzD-KVkQrmW7zsrsjcZAYDyPHZmV9YYhPOxZA.jar
    Nov 06, 2020 12:47:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.27.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.27.0-SNAPSHOT-VBxf31bBSXnlN_huu1wRO_0zGn1hfZF2eSPtU9h4HrM.jar
    Nov 06, 2020 12:47:22 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 196 files cached, 30 files newly uploaded in 1 seconds
    Nov 06, 2020 12:47:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 06, 2020 12:47:23 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 06, 2020 12:47:23 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 06, 2020 12:47:23 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 06, 2020 12:47:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 06, 2020 12:47:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98996 bytes, hash e5a9ae63614911066d9d4a9bdf483fef767a7bdc7a8b2f7a8b849ff0cca83b44> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-5amuY2FJEQZtnUqb30g_73Z6e9x6iy96i4Sf8MyoO0Q.pb
    Nov 06, 2020 12:47:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.27.0-SNAPSHOT
    Nov 06, 2020 12:47:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-05_16_47_23-11530165549556791506?project=apache-beam-testing
    Nov 06, 2020 12:47:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-05_16_47_23-11530165549556791506
    Nov 06, 2020 12:47:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-05_16_47_23-11530165549556791506
    Nov 06, 2020 12:47:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-06T00:47:23.469Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 06, 2020 12:47:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T00:47:32.626Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 06, 2020 12:47:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T00:47:33.224Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 06, 2020 12:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T00:47:33.265Z: Expanding GroupByKey operations into optimizable parts.
    Nov 06, 2020 12:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T00:47:33.300Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 06, 2020 12:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T00:47:33.388Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 06, 2020 12:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T00:47:33.421Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 06, 2020 12:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T00:47:33.457Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 06, 2020 12:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T00:47:33.492Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 06, 2020 12:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T00:47:34.251Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 06, 2020 12:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T00:47:34.320Z: Starting 5 workers in us-central1-f...
    Nov 06, 2020 12:47:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T00:47:46.324Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 06, 2020 12:48:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T00:48:02.488Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Nov 06, 2020 12:48:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T00:48:02.520Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Nov 06, 2020 12:48:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T00:48:07.815Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 06, 2020 12:48:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T00:48:23.779Z: Workers have started successfully.
    Nov 06, 2020 12:48:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T00:48:23.813Z: Workers have started successfully.
    Nov 06, 2020 12:48:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T00:48:56.406Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 06, 2020 12:48:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T00:48:56.671Z: Cleaning up.
    Nov 06, 2020 12:48:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T00:48:56.771Z: Stopping worker pool...
    Nov 06, 2020 12:49:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T00:49:51.229Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 06, 2020 12:49:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-06T00:49:51.284Z: Worker pool stopped.
    Nov 06, 2020 12:49:57 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-05_16_47_23-11530165549556791506 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 9de4e497-3e0a-456c-aebd-2c5a04b426de and timestamp: 2020-11-06T00:49:57.095000000Z:
                     Metric:                    Value:
                   read_time                    12.676
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 06, 2020 12:49:57 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 48.922 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 40s
137 actionable tasks: 103 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/iqkzpjqnxrdzw

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1203

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1203/display/redirect?page=changes>

Changes:

[Pablo Estrada] Updating BigQuery client for Python

[Kenneth Knowles] Suppress nullness errors in new files since last round of suppressions

[Kenneth Knowles] Fix position of @Nullable annotations since last round

[Kenneth Knowles] Exclude nonexistent org.checkerframework:jdk8 from all configurations

[Kenneth Knowles] Fix nullness error in Kotlin WriteOneFilePerWindow

[Kenneth Knowles] Allow checkerframework on API surfaces

[Kenneth Knowles] Enable checkerframework globally

[je.ik] [BEAM-11191] fix ClassCastException when clearing watermark state

[noreply] [BEAM-3736] Let users know that CombineFn.setup and teardown are not

[noreply] [BEAM-11151] Adds the ToString well-known transform URN (#13214)

[noreply] Merge pull request #13164 from Refactoring BigQuery Read utilities into


------------------------------------------
[...truncated 349.90 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 05, 2020 6:49:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 05, 2020 6:49:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 05, 2020 6:49:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 05, 2020 6:49:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 05, 2020 6:49:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 05, 2020 6:49:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 05, 2020 6:49:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1452961901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:280)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:153)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:546)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 05, 2020 6:49:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 05, 2020 6:49:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 05, 2020 6:49:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 05, 2020 6:49:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 05, 2020 6:49:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 05, 2020 6:49:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 05, 2020 6:49:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 05, 2020 6:49:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Nov 05, 2020 6:49:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT--GKR7rGhFyrwGu5jfuSBEbB_oZg2cVF_MtbZz7cz1Xs.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-4-rXNPMcyQIkhtX4UGT6tN1j0RbLhIv9FJI99OCswAQ.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7916492077330083055.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-frAFQosZNcNy-pnZvAPNKxp5pfm52r2JA1xikGutuvM.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-NGNmD2F5YR-wjI1a7rmJQWZ9nbRvZp7mTkvv4iPp5og.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-QaoeCRAKw8jSc1owuYgYFNkScp_v5ujIL_M69ubXylw.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-jlqzkosK-imilXdbzQ_7A8n7E7QvKZr-P-3LBXq6b_E.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-lk7nXGua4ByIxqrzI1JztKOtUvccT70jinBmofH8DpQ.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-cmiRDXO9k4QrajoNrrIF3n66f4WVvw8x9l7dGPoKhmg.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-SMKDsyFxOSQpYeYoaEJD20u2xv6OIc_0bnVhRRQekcM.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-UBkdYAtC7YegPSNpuuAAkPuKBNX3EU9ac4LQ4oZHSxM.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-7Z1zLaP_amwApZ4QphU2M553unuxINw2-rBmUU6m3us.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-WKCn9lB_wz-h7samcml2TPLVOmxWd1CBRVa6oJDYHXA.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-0mwxwqlU2TUsfePm8dzmHjFk1vco1-agWZ_cLudBtmM.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT--tYyDeMA9-9mhn-dxs1PX34h43IyzBtVOaQv9WclDr4.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-4kvYsmwbdleZNF_G0LM7UeVJUseX9FwyLglZXcZ44Pg.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests--q_B4lzKWywHyz9h1v5PhldHnOSv8xe5VEKZocqrZKk.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-DL-SMdFsdsmPg3kwN9jKUAg13S3ccO1NYQFiEHXkQvc.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-yV_9Dl1CluKgS9cCNI-NNM7TC9gvcTchcGy4famuAQw.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-VYmBxqx7VAYIdLmEf_deI5GIAWi6ELTEI_r4f0we2MU.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-UGDllWmRYJJ1Pq9i5UbNREoXM8zSHgY-4JTqILUNMiE.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-RhQrSmWqAuyhzzQgcdG9OeNrKtK6lkiN5AXpapBsrIk.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-f2bIrH63UVJDJZTocVt0yf-XeseOdwqgjRi3qp0_U28.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-bNSCo-cEnrByyJipwz5rOpD0emjZer2GQYSdprRHjeU.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-MQAiPMGPKtri8ppfASqk2nADnlhiCSMkmjhpxo37MAo.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-mBfs2I5HOyEW5ngH3xFs0kjQ7n05UI5bM5OtTtroHyo.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-pRJSShBXPk4JfOUEKADY_nWEAbUVuWPzkYqkEvRpato.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-ZPuLQz12XO70AOGS5Vyd3qH38kaFwGeaGoWBeBnp7HQ.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT--GKR7rGhFyrwGu5jfuSBEbB_oZg2cVF_MtbZz7cz1Xs.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-_rdnsA3fyLHIASelmWPi8JKk0OmkF_7qc5pFYksxIJU.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-_UQYqjHmJ_PUnJzEVrZgQ7rJMkfQ1FlIDT5if3yvpPQ.jar
    Nov 05, 2020 6:49:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT--Ag_H0bFlieO9QNxJ7vf2ZNemUk-k6ko9En6GXKPrMI.jar
    Nov 05, 2020 6:49:06 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 196 files cached, 30 files newly uploaded in 1 seconds
    Nov 05, 2020 6:49:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 05, 2020 6:49:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 05, 2020 6:49:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 05, 2020 6:49:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 05, 2020 6:49:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 05, 2020 6:49:06 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98996 bytes, hash a6760f7b2201e86cb71be1c7c86ed603bacd3fe5e01c4d0decd21606896979fb> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-pnYPeyIB6Gy3G-HHyG7WA7rNP-XgHE0N7NIWBolpefs.pb
    Nov 05, 2020 6:49:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Nov 05, 2020 6:49:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-05_10_49_07-14785338458812205602?project=apache-beam-testing
    Nov 05, 2020 6:49:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-05_10_49_07-14785338458812205602
    Nov 05, 2020 6:49:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-05_10_49_07-14785338458812205602
    Nov 05, 2020 6:49:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-05T18:49:07.381Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 05, 2020 6:49:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T18:49:17.662Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 05, 2020 6:49:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T18:49:18.334Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 05, 2020 6:49:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T18:49:18.369Z: Expanding GroupByKey operations into optimizable parts.
    Nov 05, 2020 6:49:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T18:49:18.396Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 05, 2020 6:49:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T18:49:18.463Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 05, 2020 6:49:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T18:49:18.493Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 05, 2020 6:49:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T18:49:18.528Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 05, 2020 6:49:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T18:49:18.566Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 05, 2020 6:49:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T18:49:18.987Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 05, 2020 6:49:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T18:49:19.075Z: Starting 5 workers in us-central1-f...
    Nov 05, 2020 6:49:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T18:49:42.848Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Nov 05, 2020 6:49:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T18:49:42.869Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Nov 05, 2020 6:49:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T18:49:46.542Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 05, 2020 6:49:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T18:49:48.144Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 05, 2020 6:50:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T18:50:06.525Z: Workers have started successfully.
    Nov 05, 2020 6:50:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T18:50:06.562Z: Workers have started successfully.
    Nov 05, 2020 6:50:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T18:50:43.187Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 05, 2020 6:50:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T18:50:43.339Z: Cleaning up.
    Nov 05, 2020 6:50:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T18:50:43.412Z: Stopping worker pool...
    Nov 05, 2020 6:51:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T18:51:27.398Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 05, 2020 6:51:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T18:51:27.443Z: Worker pool stopped.
    Nov 05, 2020 6:51:33 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-05_10_49_07-14785338458812205602 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c1a373d3-ee5d-4438-ac32-c27971105bb1 and timestamp: 2020-11-05T18:51:33.646000000Z:
                     Metric:                    Value:
                   read_time                    16.405
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 05, 2020 6:51:34 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 4 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 2 mins 40.999 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings
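
To reproduce the two failing tests locally with full stack traces and the individual deprecation warnings shown, a rerun from a Beam checkout along the lines of the following should work; the task path is taken from this log and the flags are the ones Gradle itself suggests above, while any project-specific integration-test options still have to be supplied the same way the Jenkins job does:

  ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --stacktrace --warning-mode all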

BUILD FAILED in 7m 13s
137 actionable tasks: 106 executed, 31 from cache

Publishing build scan...
https://gradle.com/s/daqyydldw3aym

Stopped 3 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1202

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1202/display/redirect>

Changes:


------------------------------------------
[...truncated 303.28 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 05, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 05, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 05, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 05, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 05, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 05, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 05, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@209978778]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:280)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:153)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:546)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
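
    The IllegalStateException above names the fix it wants: the Row-typed output of ParDo(RowMonitor) carries no schema, so no coder can be inferred, and PCollection.setRowSchema is the suggested remedy. Below is a minimal, self-contained Java sketch of that call; the schema fields mirror the four projected columns in the query, but the names, values, and the trivial DoFn are illustrative only and are not the IT's actual code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Schema mirroring the four projected columns in the failing query (illustrative only).
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

        // A tiny in-memory source of Rows; RowCoder.of(schema) gives Create an explicit coder.
        PCollection<Row> rows = pipeline.apply(
            Create.of(Row.withSchema(schema).addValues("someone", "story", "a title", 3L).build())
                .withCoder(RowCoder.of(schema)));

        // A ParDo that re-emits Rows produces a schema-less output, so the output needs
        // setRowSchema() -- the call the IllegalStateException above asks for.
        rows.apply("RowMonitor", ParDo.of(new DoFn<Row, Row>() {
              @ProcessElement
              public void processElement(@Element Row row, OutputReceiver<Row> out) {
                out.output(row);
              }
            }))
            .setRowSchema(schema);

        pipeline.run().waitUntilFinish();
      }
    }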

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 05, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 05, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 05, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 05, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 05, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 05, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 05, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 05, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Nov 05, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-eu4Y-Zz3yun-_iITlY6C7oL4f4GKfEPYoSYGrwY0XBA.jar
    Nov 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test9000152385675639459.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-6OL8s6mmnJWah4KUVCYXb0DOagsVQR1_anaduOvlXKM.jar
    Nov 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-6dQhj23I7gVcUZwYFOMFaX4Qi4pu0mvQ-TRYdyhodxk.jar
    Nov 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-dPFO17tGIwQnZj0VuOSrpFIHf-ZB4OrbgXRChvszvVM.jar
    Nov 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-eAso5pPWpfqqmzXj6ViJMUC9DMC-QLKSx60qR910XAk.jar
    Nov 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-Bfc8VDwmF4VMEzPZNRp3A2nY7lFf29strZo23s9-aww.jar
    Nov 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-PDIM883XBO2swZNRQ05KqqNGZbP9zX6alWWfINAIMoc.jar
    Nov 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-bV3IfWqPkBnbnPAVgjFtkJWmNLSCqLhAprwim6Dp5IQ.jar
    Nov 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-IgxWPt8TmFXwOEEdmNdAVaeuF5vsIAA-9v4lWKcI7u0.jar
    Nov 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-YntHU-nhv7Bawzlb2Bq3sh29ibjZ7yUOeeG8W3H_3w8.jar
    Nov 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-uiBU2-9yod2Jn4wsNcqqQy5HmvJgeFy24PhJNBsEYR8.jar
    Nov 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests--apMzeTFd78_75QPGsH1DQekzaBfBCEyycgfctygJrc.jar
    Nov 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-WxBwaJ4aZGjyOJnrJzWncFInNQO2fCTokeYwyBzv3Iw.jar
    Nov 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-D3VYOoVC0P7PWBzm8rLJznA0r5JosAO1mx8h9L9PFyc.jar
    Nov 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-mOYE3SjhKjXeZcVE6RctsMcaux0MfcqOAhsGq4XVGQ4.jar
    Nov 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-95W6uv2bUW5CoaD03zNq_dbwTSsuv23OEZBt0vQO1JQ.jar
    Nov 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-JFY7vq_waYoyjx55o3yGDOJaroI6s7cnNenQHxlszb4.jar
    Nov 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-6_bvDqFDyJ7xdabIVkI7iglzVJWvsR4HXcr4ZfU1fec.jar
    Nov 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-hhe3RqbH_6uNmktyn5NtNjz-FgYoLEtl7cpDBmYD-6M.jar
    Nov 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-gLW5h68869lNNWv4i7mrq3xNI2zO7Go77yzjmAS2ncw.jar
    Nov 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-pnehDnUOS6zSzHNmZ0fiONTDgeVOfsLpAj7SSnHPNSM.jar
    Nov 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-88343nZVPaPwlTdqok-hJMlzXLsNqYLZvhd0K1epBLY.jar
    Nov 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-hOJ_IuZWWKOrJ0wR_EOsBImMwLuW2uPSQw71ctKqrRM.jar
    Nov 05, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-BehElJqYh049445h1NK2MXoqwgzmE_3-P78EWV4Yn-Y.jar
    Nov 05, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-cO1GzLXuHqJgzTkrSzD4uSb8QViwACSwGMhIQ4owuJU.jar
    Nov 05, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-oG8omb8D1FfszvkZ6tG32H7ayFaYTdKoPr4jABjXO0I.jar
    Nov 05, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-bqd4Vrtfx1xGBR7FkkMD3KrGF1-Ze-C4nA8dv0P1fBM.jar
    Nov 05, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-eu4Y-Zz3yun-_iITlY6C7oL4f4GKfEPYoSYGrwY0XBA.jar
    Nov 05, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-CLxC7x_ty2qP5sTDnmfC8vyCqwuZcAS5o9tqogmezWg.jar
    Nov 05, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-gXezbkTpORW2wytlXbYf_cUJWQBMYvJJaSyCLR8Bkp8.jar
    Nov 05, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.alibaba/fastjson/1.2.69/6cb063f1d527ff65bdbb9ea74888a5ffc3f92197/fastjson-1.2.69.jar to gs://temp-storage-for-perf-tests/loadtests/staging/fastjson-1.2.69-KniRdEoDeOAJmpuB3b7U0uGn6GMvb4b4_8-jg-UeScg.jar
    Nov 05, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-n9j75bprhik-23L7t2afP5fVTSABwB8V2GrqMnIYltw.jar
    Nov 05, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 195 files cached, 31 files newly uploaded in 1 seconds
    Nov 05, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 05, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 05, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 05, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 05, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 05, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98996 bytes, hash caa89f6af415c331a5f0b40e6027b6eaac23f99b01e59eef92047574a54d52fd> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-yqifavQVwzGl8LQOYCe26qwj-ZsB5Z7vkgR1dKVNUv0.pb
    Nov 05, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Nov 05, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-05_04_45_24-7376980958611219194?project=apache-beam-testing
    Nov 05, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-05_04_45_24-7376980958611219194
    Nov 05, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-05_04_45_24-7376980958611219194
    Nov 05, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-05T12:45:24.515Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 05, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T12:45:34.009Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 05, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T12:45:34.792Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 05, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T12:45:34.837Z: Expanding GroupByKey operations into optimizable parts.
    Nov 05, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T12:45:34.866Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 05, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T12:45:34.943Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 05, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T12:45:34.980Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 05, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T12:45:35.014Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 05, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T12:45:35.143Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 05, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T12:45:35.512Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 05, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T12:45:35.594Z: Starting 5 workers in us-central1-f...
    Nov 05, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T12:46:02.479Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 05, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T12:46:02.996Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Nov 05, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T12:46:03.018Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Nov 05, 2020 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T12:46:13.578Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 05, 2020 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T12:46:23.145Z: Workers have started successfully.
    Nov 05, 2020 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T12:46:23.181Z: Workers have started successfully.
    Nov 05, 2020 12:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T12:47:00.478Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 05, 2020 12:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T12:47:00.688Z: Cleaning up.
    Nov 05, 2020 12:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T12:47:00.819Z: Stopping worker pool...
    Nov 05, 2020 12:47:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T12:47:55.929Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 05, 2020 12:47:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T12:47:56.021Z: Worker pool stopped.
    Nov 05, 2020 12:48:02 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-05_04_45_24-7376980958611219194 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 8fe5bbf6-ce48-4fa4-bd36-6824dc97f019 and timestamp: 2020-11-05T12:48:02.446000000Z:
                     Metric:                    Value:
                   read_time                    16.082
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 05, 2020 12:48:02 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 2 mins 54.656 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 45s
112 actionable tasks: 71 executed, 41 from cache

Publishing build scan...
https://gradle.com/s/qdf356xd3m4rw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1201

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1201/display/redirect?page=changes>

Changes:

[Udi Meiri] [BEAM-2717] Implement ProtoCoder.to_type_hint

[Robert Bradshaw] Allow use of index as series.

[Robert Bradshaw] Allow setting columns.

[Robert Bradshaw] Add utility to test a set of strings.

[Robert Bradshaw] Add a proxy for panda's top-level module functions.

[Robert Bradshaw] [BEAM-9547] Implement pd.concat().

[Kenneth Knowles] Add class-level suppression of rawtypes errors

[Kenneth Knowles] Enable rawtype errors globally

[Robert Bradshaw] todo, lint

[noreply] [BEAM-3736] Disable CombineFnVisitor (#13266)

[noreply] [BEAM-11196] Set parent of fused stages to the lowest common ancestor


------------------------------------------
[...truncated 305.08 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 05, 2020 6:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 05, 2020 6:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 05, 2020 6:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 05, 2020 6:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 05, 2020 6:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 05, 2020 6:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 05, 2020 6:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1452961901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:280)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:153)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:546)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 05, 2020 6:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 05, 2020 6:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 05, 2020 6:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 05, 2020 6:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 05, 2020 6:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 05, 2020 6:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 05, 2020 6:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 05, 2020 6:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Nov 05, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-HMkrq-Hum3ETwTmygFg-XlpHEn7LFA7-s-yvU4s2BQQ.jar
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-biqX16Ev9GxzqXPtyZ2_KCC4jSabTx-9ECpsJqUZAVc.jar
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-cDevVncgQXwcFHWNxfuwBdJ9ebKA_KzWizdCq-Y_R0c.jar
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6106218676643640974.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-yH6Z9OW9wSu9sy2NaY_nXR7J9NfM2JFWSKASsJfcRtU.jar
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-rVtbTPzX8Ncey2J_w5c56cnl2Hc3a3FKy3uuacxwDqg.jar
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-CFFBKIY_afE0FkVH_ElaB0Br7mnKawHx5xWH0OcP2gA.jar
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-CHzG9Cyg5iRCOuaJDumzEzS3rUIe8FTQXMNfHi8gK-Y.jar
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-zZpiRg9Nri_67UAtXNUo7tqso7-DEerTMLuF976EwIQ.jar
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-yXCvwC7oTN8P1kN3pA_didNaEWGtj9gG9Q8ZLD_YAS0.jar
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-k8buq_1sUpsR9b6T295HlKt_LDsYsf44wQjS5ZF2hEg.jar
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-1ZRKkjrJzfh7LP7oWu1ZIzmyfeZRzr381hva4hAD8yU.jar
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-FSnfGQmQ7xtiwXOPmxoS6lTYPnfTuVP4Fdx2PvOpipA.jar
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-VWww__Rhobo2QdflwbXKAYYxPoDieY9yFDAZb36O1Io.jar
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-JeTpAXnMVP_f3amDDs-keDx1AcXArWTm73-sRx1x8Vw.jar
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-UXt0ADC_zxv3MLTHqyf-c8UNRD055R6xkRmi4DOEi3s.jar
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-hc8Xabn_J-HS7yVya4dVqi8elP10vWikDlE-2c6FXyE.jar
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-J22T6qx2hRna2imgL_U0Op-OPTm0xgpXPJjTSTlSRw0.jar
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-7g0P14-1-oADqYseVudT4KbN-vd9ipTbNNfWU7BHJ5Q.jar
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-jdxKI6nsLw4I04OEDWa-PzFZQHR99Lw4fdD_GsuAmcc.jar
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-Jwu7IWxZnVyBW5mtvyq4cj4R7PQfBlM7Z1ohrMHGEFo.jar
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-AeSlx7hasYHvPe0VJ0OLR2JT_xXMQfyOsu0aeacBXM4.jar
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-SJfhScwfLECVYJAjywpXwZPCUDx46BoZwmymbEACl68.jar
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-s9VPrH27ctoRjeinXzUgYQh2I1oDQe19Mw-grTv-3RY.jar
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-4EJNGA34eTiIiiK465f27TrWOgSxVGSLFYcD0NwVldc.jar
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-l_wGabrTIoPhEzgN1QsosinY1zR1Qz8bdfXvlCuC73I.jar
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-ZzH31rUuqgilOUlD6MVTEzvrbaiLUK07NJFxBFloYLo.jar
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-F9gQ9Nh84fOywftPuomSw-AJM7lWYFVUkYuTL4z-X0w.jar
    Nov 05, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-HMkrq-Hum3ETwTmygFg-XlpHEn7LFA7-s-yvU4s2BQQ.jar
    Nov 05, 2020 6:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-Ed0IorHKMyjLsWBYYzVFwhR-qce4uSrT3fYVf6y2JiA.jar
    Nov 05, 2020 6:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-pX0NkliWhIrAyRX5jPk28izhrV8JvvtMO6w1gqbHBGE.jar
    Nov 05, 2020 6:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-kIRVtMFUHmE0EJci2Mb5-7axavjgsRSKPZIPY-pmqOE.jar
    Nov 05, 2020 6:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 196 files cached, 30 files newly uploaded in 1 seconds
    Nov 05, 2020 6:45:42 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 05, 2020 6:45:42 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 05, 2020 6:45:42 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 05, 2020 6:45:42 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 05, 2020 6:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 05, 2020 6:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98995 bytes, hash eba53b3094b535328fee5b87a696fe349bcbdf00c81373a0f95e02c2eb364c56> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-66U7MJS1NTKP7luHppb-NJvL3wDIE3Og-V4Cwus2TFY.pb
    Nov 05, 2020 6:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Nov 05, 2020 6:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-04_22_45_42-4847622068833025548?project=apache-beam-testing
    Nov 05, 2020 6:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-04_22_45_42-4847622068833025548
    Nov 05, 2020 6:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-04_22_45_42-4847622068833025548
    Nov 05, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-05T06:45:42.567Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 05, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T06:45:53.530Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 05, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T06:45:54.385Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 05, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T06:45:54.647Z: Expanding GroupByKey operations into optimizable parts.
    Nov 05, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T06:45:54.748Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 05, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T06:45:54.898Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 05, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T06:45:54.927Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 05, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T06:45:54.963Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 05, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T06:45:54.998Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 05, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T06:45:55.450Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 05, 2020 6:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T06:45:55.530Z: Starting 5 workers in us-central1-f...
    Nov 05, 2020 6:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T06:46:07.691Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 05, 2020 6:46:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T06:46:23.562Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 05, 2020 6:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T06:46:44.343Z: Workers have started successfully.
    Nov 05, 2020 6:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T06:46:44.403Z: Workers have started successfully.
    Nov 05, 2020 6:47:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T06:47:24.377Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 05, 2020 6:47:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T06:47:25.376Z: Cleaning up.
    Nov 05, 2020 6:47:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T06:47:25.506Z: Stopping worker pool...
    Nov 05, 2020 6:48:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T06:48:16.770Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 05, 2020 6:48:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T06:48:17.254Z: Worker pool stopped.
    Nov 05, 2020 6:48:25 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-04_22_45_42-4847622068833025548 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f74ab7c4-aca2-4356-91c7-c970d3036ce5 and timestamp: 2020-11-05T06:48:25.678000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    19.265

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 05, 2020 6:48:26 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 57.973 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
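
As a concrete example of the suggestion above, the failing task can be rerun from the Beam source tree with extra diagnostics (illustrative command, not part of this job's configuration):

> ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --stacktrace --info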

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 9s
112 actionable tasks: 75 executed, 37 from cache

Publishing build scan...
https://gradle.com/s/6up2ap4sjqgqe

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1200

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1200/display/redirect?page=changes>

Changes:

[Pablo Estrada] Adding display data to BQ File Loads transform

[noreply] [BEAM-3736] Add CombineFn.setup and CombineFn.teardown to Python SDK

[noreply] [BEAM-11190] Fix grouping on categorical columns (#13256)


------------------------------------------
[...truncated 301.89 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 05, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 05, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 05, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 05, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 05, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 05, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 05, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@209978778]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
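
The root-cause text above names the usual fix: a PCollection of Beam Row elements needs a Schema attached before a RowCoder can be chosen. Below is a minimal Java sketch of that fix; the DoFn, class name, and field values are illustrative assumptions and are not the IT's actual code, though the field names mirror the query in this log.

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;

public class RowSchemaFixSketch {
  // Field names mirror the SQL above but are otherwise illustrative.
  private static final Schema SCHEMA =
      Schema.builder()
          .addStringField("author")
          .addStringField("type")
          .addStringField("title")
          .addInt64Field("score")
          .build();

  public static void main(String[] args) {
    Pipeline p = Pipeline.create();

    PCollection<Row> rows =
        p.apply(Create.of("story,3", "job,5"))
            .apply(
                "ToRow",
                ParDo.of(
                    new DoFn<String, Row>() {
                      @ProcessElement
                      public void processElement(@Element String line, OutputReceiver<Row> out) {
                        String[] parts = line.split(",");
                        out.output(
                            Row.withSchema(SCHEMA)
                                .addValues("someone", parts[0], "a title", Long.parseLong(parts[1]))
                                .build());
                      }
                    }))
            // Without this call the pipeline fails with the same
            // "Unable to return a default Coder ... setRowSchema" error as above.
            .setRowSchema(SCHEMA);

    p.run().waitUntilFinish();
  }
}

Attaching the schema (or, equivalently, calling .setCoder(RowCoder.of(schema))) is what resolves the IllegalStateException reported by this test.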

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 05, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 05, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 05, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 05, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 05, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 05, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 05, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 05, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
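
For context, the projection (usedFields) and filter shown above are applied server side by the BigQuery Storage Read API. A rough, hand-written equivalent using the public BigQueryIO API might look like the sketch below; the table reference and field/filter strings are assumptions for illustration, not taken from this job's configuration.

import java.util.Arrays;
import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.values.PCollection;

public class DirectReadSketch {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create();

    PCollection<TableRow> rows =
        p.apply(
            "Read with projection and filter pushed to BigQuery",
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full") // assumed table
                .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ) // Storage Read API
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction(
                    "(`type` = 'story' OR `type` = 'job') AND `score` > 2"));

    p.run().waitUntilFinish();
  }
}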
    Nov 05, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 05, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 05, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-69QFMTzA6kyDtjuo1LmWDA9s4PxRmEp83wxR1CdptYo.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5001688513253122980.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ss5E3NToq85V5zfp3oUL69R7FRedYedAFk5KtuLQEJk.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-yNday6KF6N5Kg3FjZj4oSbv9_w4GwTZvNdGSy0YW-vM.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-cRECsotX3Pik18AYgZLCkQ3O5dWN9SpwkQa_yZd-hx0.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-kEcfi9u1MLhdgWfU4kZBunI7NugzcPNNpszxXVOiSJc.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-rWocI4KmJatoBwixeh2XJMEl-Fu8XJ-m28hhb4bhum8.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-lUpRVu1DVcFldqXBKBKChp0p0lrBMYy5aw0osNBvDAQ.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-vJRzbjBWUQcwWz3bfu8KnMvTqFlgUEEGAqaen5OvcQI.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-cQXy1Zv9DJumMhCfw_WFlHroEg5q-1ou51quNzFubgA.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-T72aI_xkFB1RzwflUL-CB96ed-C-cPO2HkbSwABPpVM.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-DKLALN_khC-DsgJlZ1xr6I6-wn-t8lcw1syW59MK8J4.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-vb9-exza4LbUQ_KOzU9BxqwmfPYEMvMnYMWjhWzBJZ4.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-Hc9RfZBUn2icWWznxkQeb-1cjwqO7DuYq8Gz7vWg1Bo.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-i_0LMgLiqe0-g6ucbXKIeLW_LlZ8p5I_SZ-QPU5-0qM.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-JgRCYhO-cHc8FzRcZpHfqr533Pg4hfMeo9z6vJx7Bvk.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-cHz3W9DFfgAnFA_xtGOdt6qNQ_28mgGatccyU3Jke0Y.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-69QFMTzA6kyDtjuo1LmWDA9s4PxRmEp83wxR1CdptYo.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-qLA8m2wr-cijA-p6msrP4rv9XUciYvkDjQUQApgmaS0.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-gtprfLwGdjn0PvPtLz_BvkwdfdbtAikuBYEO1YEFysg.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-FUv5gOc9w5TjBFFzQhOIOtDQE62iLKs1IQGz48QKPX0.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT--RV2EFC0xBpcqFG-YTjg8rdK9qKQWaZ6RVXeRNXFFrA.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-7VOnIK44CPCr8KiUUyJHN0PZNkLq-qPNvbWYaQNv_G0.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-GYlX5XQtqYztx18waa0nMHGyKwmll96E2f1TA9Q8_G8.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-CAISh7AXqRCLWop3iW6Bw8oIM_dyMgFxhXCEIV0ku1Y.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-SKnex9E4cz-zm4oScz49e1Jm8z3w8pvzYllOj-p7z9k.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-W_UpViYdft1LgJB4CE1ET4CeWiPs7WAthjc3so-EFRI.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-f8o9pJ26sWnYN3lVdCOs51wmJZmORGmDw9nxmFbU6Yw.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-xR0tBlCB_40Iax9YHEu41rqAhrVVsZzaTDGFdrG8NkU.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-BeEAUXGswSc6VmCYPuuKjyXe1xwOUFNjJ7DuR1idTZU.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-PNwyS14ZwNJLvCVgT_TqwxlyPSFLTX1m3gDpyXpWi1k.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-hNh7Zg6RYC2Oj_VZqefDlF8Brba5-LjJ51JR6Sjrv_c.jar
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 196 files cached, 30 files newly uploaded in 0 seconds
    Nov 05, 2020 12:45:24 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 05, 2020 12:45:25 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 05, 2020 12:45:25 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 05, 2020 12:45:25 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 05, 2020 12:45:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 05, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98992 bytes, hash 644513cc987686157ea3375f977b85530c1d83e76e19740b3586e37ac8567e72> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ZEUTzJh2hhV-ozdfl3uFUwwdg-duGXQLNYbjeshWfnI.pb
    Nov 05, 2020 12:45:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Nov 05, 2020 12:45:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-04_16_45_25-14458202331586084358?project=apache-beam-testing
    Nov 05, 2020 12:45:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-04_16_45_25-14458202331586084358
    Nov 05, 2020 12:45:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-04_16_45_25-14458202331586084358
    Nov 05, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-05T00:45:25.530Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 05, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T00:45:36.449Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 05, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T00:45:37.453Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 05, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T00:45:37.514Z: Expanding GroupByKey operations into optimizable parts.
    Nov 05, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T00:45:37.548Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 05, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T00:45:37.716Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 05, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T00:45:37.770Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 05, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T00:45:37.841Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 05, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T00:45:37.898Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 05, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T00:45:38.386Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 05, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T00:45:38.494Z: Starting 5 workers in us-central1-f...
    Nov 05, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T00:46:07.253Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 05, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T00:46:16.057Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Nov 05, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T00:46:16.115Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Nov 05, 2020 12:46:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T00:46:34.787Z: Workers have started successfully.
    Nov 05, 2020 12:46:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T00:46:34.852Z: Workers have started successfully.
    Nov 05, 2020 12:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T00:46:58.593Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 05, 2020 12:47:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T00:47:15.532Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 05, 2020 12:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T00:47:15.777Z: Cleaning up.
    Nov 05, 2020 12:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T00:47:15.884Z: Stopping worker pool...
    Nov 05, 2020 12:48:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T00:48:07.852Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 05, 2020 12:48:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-05T00:48:08.009Z: Worker pool stopped.
    Nov 05, 2020 12:48:24 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-04_16_45_25-14458202331586084358 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 5a017a39-09e3-4d7e-8381-989497a23be6 and timestamp: 2020-11-05T00:48:24.764000000Z:
                     Metric:                    Value:
                   read_time                    19.589
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 05, 2020 12:48:25 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 3 mins 14.654 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 8s
112 actionable tasks: 71 executed, 41 from cache

Publishing build scan...
https://gradle.com/s/mh2nvyvhtkqqq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1199

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1199/display/redirect?page=changes>

Changes:

[sychen] Fix GroupIntoBatches.test_buffering_timer_in_fixed_window_streaming

[noreply] [BEAM-11162] Fetch missing projectId from options (#13234)


------------------------------------------
[...truncated 304.07 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 04, 2020 6:45:47 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 04, 2020 6:45:47 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 04, 2020 6:45:47 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 04, 2020 6:45:47 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 04, 2020 6:45:47 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 04, 2020 6:45:47 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 04, 2020 6:45:47 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1452961901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 04, 2020 6:45:47 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 04, 2020 6:45:47 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 04, 2020 6:45:47 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 04, 2020 6:45:47 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 04, 2020 6:45:47 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 04, 2020 6:45:47 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 04, 2020 6:45:48 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 04, 2020 6:45:48 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
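
In the DIRECT_READ case the planner has pushed both the projection (usedFields=[by, type, title, score]) and the filter down into the BigQuery Storage read, so only those four columns are scanned and the predicate is evaluated server-side. As a rough sketch, this corresponds to configuring BigQueryIO directly as below; the table name is an assumption, and the actual test goes through the Beam SQL table provider rather than hand-writing this read.

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;

    public class PushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        p.apply(
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full")  // assumed table
                .withMethod(Method.DIRECT_READ)
                // Column projection pushed to the Storage API.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Row filter pushed to the Storage API, matching the log line above.
                .withRowRestriction(
                    "(`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        p.run().waitUntilFinish();
      }
    }
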
    Nov 04, 2020 6:45:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-uUbDFWSl4nuNZJI3S-5l2JeCdeTkgd-WN7m8SnVnW4E.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-PFXNWfGcf0D9cTnZsuikcLNQVm5lAYRaJ8K-HD2dVBw.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-Kf5Fp24NHLdrxPxJ8SIbvXeg4XDLut8QALJDSSsGoT4.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-lar5reZtR0DEeVFwEDz-exOiAjfMffuBDsy2o8aK1PE.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-GrfzI6Q-HWoyqgf4E1RgVoBTQjHGBQKe7Kx62DCuLks.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-FrXeBENmmrNhAfABEKVUo3sppzeMY0eG-14ChXbd1jg.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-2-7_1gGgyeiC9vUiYFPg17lblkIKqhBPtfMwyak8Jb0.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4807789417195119782.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-iPYnkBDHxo9fK-PEZ8nC_R6vrD8epFaWMEG5zSdic_Y.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-muyK8_XgAMbSY4A6aDL_ScxmATt3Ynjn0o_w_vp7RNQ.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-6DNXGdfUsf9NyIyIGAR1vYrQZKU8nBEM0QfFucnBOEI.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-pm7TZbptoDymgu8Xmv2nv2Q8EmSKNamGYSUm-Lpd68U.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-aXATaC2z3WJThjrbqoo_VqjQ0kI5lVyOxOyw36uAGsc.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-MdjVHSrV37bw-DWedyH3WZ4_GkIpRKGLbdcLtQSHUgM.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-hi5r9cSIr_rg9egspVzs7Ilodg8InaVX19kCpF69tkY.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-dAC_KoyXBwZ-PO0fy5_aELhfaQb0xzkpzNjpD-pTSfI.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-yaxgB8p0pGui6Gq0MgBdz1kxqtnJ7Toc24D6JzPG_Dg.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-dpCNLdODIMxSv1z_ItwiVrZBkXVJVi3hjDHzE6j3hxE.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tTNk_7sj-OzkT5Awts64M8H2SZTcpW8aEVqnEi5rRBE.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-y79nzRTfu6Al3b_57qWxPiDfVKLKhaQmqj-mMAW0_nE.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-7RXdblfetj-3CIKs-TsB9fw_xFDnjy0uye-7x8FpECU.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-Hm40EdsA2q6Jat5Bm5losYntAVzRktgZwBqWUuxk6Bg.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT--aD5hbXQNh2biMbnBavVo2YtGmk3TQV3h4rd96dJFuw.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-M-faAc94lCnMpuSaJ2jXwwzhfKua3JwzZVTy2gKqa2M.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-PcbWfNPRfVO_vCU0nxy8B1tI5sZ-5yJMYi2pBuwqMbE.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-xWRz-nqC3W_plpNkchZ19qqqbkFnyw5Lv1Lf0jXkXWo.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-_jjqB5lbZGjAM0-B_W4w4ooU9b4cUF5-420bWMHGO8A.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-afxm9GQINOTLf2vFR3dDfm9WmTRYy9bNNsjLbf_xDMs.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-uUbDFWSl4nuNZJI3S-5l2JeCdeTkgd-WN7m8SnVnW4E.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-FZH4e9bU44knr2zbNuJUyujA67zU_NR1fsjqEzAkYG0.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-y-mGVaf5rTShGyLNjRNMu6lf98Gbrf3_OsGiuI1-VwA.jar
    Nov 04, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-cXkMLE7J6EJI_nE26aqMAfZOAGRsgE1AMSkmJNcZwk0.jar
    Nov 04, 2020 6:45:53 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 196 files cached, 30 files newly uploaded in 1 seconds
    Nov 04, 2020 6:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 04, 2020 6:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 04, 2020 6:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 04, 2020 6:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 04, 2020 6:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 04, 2020 6:45:53 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98996 bytes, hash 5c806977e823fd28bbfff2abaab822900d9e32e5de2a45b9eb039f17ec90a372> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-XIBpd-gj_Si7__KrqrgikA2eMuXeKkW56wOfF-yQo3I.pb
    Nov 04, 2020 6:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Nov 04, 2020 6:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-04_10_45_53-11695685347421003203?project=apache-beam-testing
    Nov 04, 2020 6:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-04_10_45_53-11695685347421003203
    Nov 04, 2020 6:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-04_10_45_53-11695685347421003203
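
For reference (not part of this build's output), the job's state can also be checked before cancelling with the standard describe command:

    gcloud dataflow jobs describe 2020-11-04_10_45_53-11695685347421003203 --project=apache-beam-testing --region=us-central1
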
    Nov 04, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-04T18:45:53.844Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 04, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T18:46:00.626Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 04, 2020 6:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T18:46:01.483Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 04, 2020 6:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T18:46:01.525Z: Expanding GroupByKey operations into optimizable parts.
    Nov 04, 2020 6:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T18:46:01.562Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 04, 2020 6:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T18:46:01.648Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 04, 2020 6:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T18:46:01.685Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 04, 2020 6:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T18:46:01.713Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 04, 2020 6:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T18:46:01.761Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 04, 2020 6:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T18:46:02.688Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 04, 2020 6:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T18:46:02.773Z: Starting 5 workers in us-central1-f...
    Nov 04, 2020 6:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T18:46:27.851Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 04, 2020 6:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T18:46:35.923Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 04, 2020 6:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T18:46:54.140Z: Workers have started successfully.
    Nov 04, 2020 6:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T18:46:54.171Z: Workers have started successfully.
    Nov 04, 2020 6:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T18:47:30.418Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 04, 2020 6:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T18:47:30.656Z: Cleaning up.
    Nov 04, 2020 6:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T18:47:30.739Z: Stopping worker pool...
    Nov 04, 2020 6:48:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T18:48:19.828Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 04, 2020 6:48:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T18:48:19.898Z: Worker pool stopped.
    Nov 04, 2020 6:48:25 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-04_10_45_53-11695685347421003203 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b0a1b75b-9a5a-459e-b6ff-708ebb33476e and timestamp: 2020-11-04T18:48:25.728000000Z:
                     Metric:                    Value:
                   read_time                    14.112
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 04, 2020 6:48:26 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
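
This warning only means no InfluxDB target was configured for this run, so the read_time/fields_read values above were printed but not persisted. Presumably they would be published if the InfluxDB settings were passed to the test, along the lines of the options below; the exact option names and values here are an assumption, not taken from this build:

    --influxDatabase=beam_test_metrics --influxMeasurement=sql_bqio_read_java_batch --influxHost=http://localhost:8086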

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 46.848 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 9s
112 actionable tasks: 74 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/jrtd664lo6sq2

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1198

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1198/display/redirect?page=changes>

Changes:

[noreply] [BEAM-11091] Allow to specify coder for HadoopFormatIO.Read (#13166)


------------------------------------------
[...truncated 301.51 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 04, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 04, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 04, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 04, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 04, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 04, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 04, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1452961901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 04, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 04, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 04, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 04, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 04, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 04, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 04, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 04, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Nov 04, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-2rDhFNw29rLbp6yL-dDPfNLWcmbY7NBN1E_mx86PJ5w.jar
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-sxpe39erxyE9ywzRJpeHMD12_dq8aTEJAssjVfgpCgc.jar
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-udPMcklUIk5FEQ2JtoAZZInt9VhS_pR2AOl04U6SeHc.jar
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-5tKzyJfqZtbYs93KHIZQiw8NjtA5Aq5YIOxgb4pBEHI.jar
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-OhIluH2hy07bq70RvIHw9MtLWKoipzsqSzZe8YyBx2c.jar
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-GmJzGiEQ7jw6w2FQCY6w7GUIiRoA_W4OBVJnmX5_MW8.jar
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-04CIioMFrGAbaOFtWbXydkOKSKJrmZpUQcwCI04q6vU.jar
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-zIPgkRQsoXB16F14p3jw9IGVUQuvfkupInHYzPf5xdI.jar
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-5DqrBTEDUhnb2r0niQUciAAoiEZmHhlsVXHRipUW5O4.jar
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1577745655575185458.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-7L5gUIRzuZQvh-uCwfIzvGPQW0gwyNIcz2Tq8lmvAqY.jar
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-qQpdx2HEXKQMHtQLT4JtqkqKXYQDFPXos2k6khJDIbw.jar
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-aHJTtCRzmi063vkxksAUj5h9ZJfCMeuiB3c4xWBEVBo.jar
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-1osYtnRgppV6utWptxTM4_LLxOAk3y-W0Oxwv84Ul-M.jar
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-bspOS_1smO_EqMAO0ctHLgiK66hCgqV8DcemcIb5MI0.jar
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-JvuyP7Z0WN_8TfQmDQ5ASoQ42OqGj8PhUsOvdNQkFn8.jar
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-s0zxF4YLpzRqst_lK6Awo2HP8ZvsBOybgC4KLpK2q38.jar
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-ja_KoIgKDNM0znkCfkZkmsN3Fd-LytWWxedQShCx_Vo.jar
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-siUWZqaJ0yb1yGuJ6tzLb3SRjgBFBbVDFuWFWAwVxks.jar
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-2rDhFNw29rLbp6yL-dDPfNLWcmbY7NBN1E_mx86PJ5w.jar
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-DzcPGpfIxofuYBeCLrols1Ir9skfMJXiWgBZouaDTP4.jar
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-J0KCojc1mipPskq-1UxeRRlZzLutmdd4v2G1vlI6n7Q.jar
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-1OKf98HtdPrs2Dfh9wOF79rNKfFK2ZvJ2bn95l0jOCQ.jar
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-42Cu93qFKe-P-eCphy6R4aUUshuFYWimVfbamdRoF-w.jar
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-ul61_KyGGDu2L5X_2zSaK4kv9Y92slrtQUSByP8lnWk.jar
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-LkfTsctgeV-0m_rplZLex560mCw6YqXJLf9kWbsFark.jar
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-wDQkerpX35EIqqMeXszH5hk-a8u1vyBQxfouCtVXkMU.jar
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-NGilsIqxYBKrNiOl8INGdOlio3VbeTUTycg6JYanhyM.jar
    Nov 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-7typM6cPGIVRTC2WNA1qwk-Fnu9brz3FoLi5Not_ghM.jar
    Nov 04, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-OEEyANBzTGrttnesronENcuw9ewQ17xyuc8eoOXnmA0.jar
    Nov 04, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-Q6yyF69Eh8wt7QdGq_mdFl9X-yS1UH6QFCwCLptSL7s.jar
    Nov 04, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-JcUo8h8ppJKIjyI50js_bJsHadEkJX84bqIMWnAxGXw.jar
    Nov 04, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 196 files cached, 30 files newly uploaded in 1 seconds
    Nov 04, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 04, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 04, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 04, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 04, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 04, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98997 bytes, hash e01c49095e22f293b9653953041c943a9e102f578cb6621553549a8859bb2b3f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-4BxJCV4i8pO5ZTlTBByUOp4QL1eMtmIVU1SaiFm7Kz8.pb
    Nov 04, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Nov 04, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-04_04_45_22-571482045498068693?project=apache-beam-testing
    Nov 04, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-04_04_45_22-571482045498068693
    Nov 04, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-04_04_45_22-571482045498068693
    Nov 04, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-04T12:45:22.293Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 04, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T12:45:30.872Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 04, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T12:45:31.767Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 04, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T12:45:31.814Z: Expanding GroupByKey operations into optimizable parts.
    Nov 04, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T12:45:31.861Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 04, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T12:45:31.956Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 04, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T12:45:31.993Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 04, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T12:45:32.014Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 04, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T12:45:32.054Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 04, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T12:45:32.599Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 04, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T12:45:32.689Z: Starting 5 workers in us-central1-f...
    Nov 04, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T12:45:50.207Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 04, 2020 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T12:45:58.385Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Nov 04, 2020 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T12:45:58.418Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Nov 04, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T12:46:03.755Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 04, 2020 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T12:46:22.120Z: Workers have started successfully.
    Nov 04, 2020 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T12:46:22.191Z: Workers have started successfully.
    Nov 04, 2020 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T12:46:55.555Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 04, 2020 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T12:46:55.754Z: Cleaning up.
    Nov 04, 2020 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T12:46:55.855Z: Stopping worker pool...
    Nov 04, 2020 12:47:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T12:47:44.983Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 04, 2020 12:47:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T12:47:45.029Z: Worker pool stopped.
    Nov 04, 2020 12:47:50 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-04_04_45_22-571482045498068693 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): bee22d3b-f7ac-4234-a16f-50c6bd1c1f9a and timestamp: 2020-11-04T12:47:50.484000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.171

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 04, 2020 12:47:50 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 41.724 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 34s
112 actionable tasks: 71 executed, 41 from cache

Publishing build scan...
https://gradle.com/s/zesciqwteisw2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1197

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1197/display/redirect?page=changes>

Changes:

[noreply] [BEAM-11164] Fixes bug in beam.Partition (#13236)

[noreply] [BEAM-10409] Remap all PCollections in KeyWithNone elimination (#13204)

[noreply] [BEAM-10124] Compute number of records before each offset using a

[noreply] [BEAM-10869] Remove unused PubSubSink with_attributes property (#13254)


------------------------------------------
[...truncated 310.38 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 04, 2020 6:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 04, 2020 6:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 04, 2020 6:45:42 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 04, 2020 6:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 04, 2020 6:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 04, 2020 6:45:42 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 04, 2020 6:45:42 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1452961901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
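
The failure above is Beam's generic missing-schema error for a PCollection of Rows; the message itself points at PCollection.setRowSchema. Below is a minimal, hypothetical sketch of that pattern, not the IT's actual code: the schema fields follow the query's projection, and the class, transform, and values are invented for illustration.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Field names follow the query's projection; the types are assumed for illustration.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(Create.of("seed"))
                .apply(
                    "BuildRows",
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void processElement(ProcessContext c) {
                            c.output(
                                Row.withSchema(schema)
                                    .addValues("someone", "story", "a title", 3L)
                                    .build());
                          }
                        }))
                // Without this call, resolving the PCollection's coder fails with the
                // "Cannot provide a coder for a Beam Row" error seen in the log above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }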

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 04, 2020 6:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 04, 2020 6:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 04, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 04, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 04, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 04, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 04, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 04, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
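
For context, the pushed-down projection and filter logged above are roughly what a hand-written BigQueryIO read over the Storage API would request. The sketch below is an illustration only, not the test's code: the table reference is hypothetical, while the selected fields and row restriction reuse the values from the log.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadFilterSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Read only the projected columns and apply the predicate server-side,
        // mirroring the usedFields and BigQueryFilter shown in the BEAMPlan above.
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("my-project:my_dataset.HACKER_NEWS") // hypothetical table reference
                    .withMethod(TypedRead.Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }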
    Nov 04, 2020 6:45:44 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 04, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 04, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-l-K0SMAKFqELo23Rn5H1hbqWoFBd0of1EzcU87oLLAQ.jar
    Nov 04, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-bhmdKtHemNVyGDiC4xhLLI4z1RrmaZBVRAqSBhuokqQ.jar
    Nov 04, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-4XtcwlVbcR1oMhn4Ey2k6LWxGJUqG0EBZBx_1cNYCwg.jar
    Nov 04, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7644056767435405616.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-utO3mlIODLvdnn2biTzEcF5F96uK0bHowrwr7XsGQzw.jar
    Nov 04, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-rB8CiUuosjUzyspUVWw2GCY2E_hZxL911irLSoQ3uIE.jar
    Nov 04, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-efAXKVy9_s9RSqELEapkZeKFy8xzJlAF_t_VHHTAkxw.jar
    Nov 04, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-er5q23eLg9s13xwnnJPGvknphibTCy7qrK7pbXdDS6s.jar
    Nov 04, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-uZSdarM8bc6jfGwD--vLsKjqcoKz3iBI1UAy_XB-4Xg.jar
    Nov 04, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-ChTUqA8PPvyd5wN-_lhOZGdpzYiXSCZuKbGddEsQhis.jar
    Nov 04, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-2DQM2SWHarRgtDEbDZkQGikY03biM00BeR7cfzLzBYs.jar
    Nov 04, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT--_B5cCnkSSgfeZocOvdoiN6hDKPhhkn21sEqafHRBbA.jar
    Nov 04, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-SnNY6bwzgo3MuIzuFobwddr0zPIH-QbikXz4vc_YJYE.jar
    Nov 04, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-m4kKsF-L1RP6RNmUCWW1zphZCLHq8wM0Rey9XKBJVJo.jar
    Nov 04, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-E4S8RUX_2r8H7LJax79MX9AEVAHOdhOnoiOB92ISk4E.jar
    Nov 04, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT--KwAu36Bveyz1PixWit5zjY-gNCP3QTi98OVhpVHWX0.jar
    Nov 04, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-1FExWir0amrqMLDAdkl45-wINP0Fv7_49CqO8X3bmPk.jar
    Nov 04, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-4wvumMQIOjZKnWfHRB39ZIDFMtw5AJZUP88HtvrIa6I.jar
    Nov 04, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-yVllimeF2xYdTYQ2q5xwA_57AQlJqRRes3l90CqRVTM.jar
    Nov 04, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-mCsNc-7n1yCbkAR_h1bYEkRzkosI8mwsb56Z-GlBNCI.jar
    Nov 04, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-Krxm8HcbN4kS2D8EPwtFpLxjaJhlUy9cy9UjZjbk0Xc.jar
    Nov 04, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-l-K0SMAKFqELo23Rn5H1hbqWoFBd0of1EzcU87oLLAQ.jar
    Nov 04, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-RzE49fjmhQaUKkp0Kl2LuID-KUFPNkGf3o5f2R6QA_4.jar
    Nov 04, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-TYGl4rd1s2ZWt91sR2lMudUOOww1t7S_okahumq_eRI.jar
    Nov 04, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-yLi77IiHy_nvXfcGaDQt7P9R6czT4txBKX4DIjsOCCo.jar
    Nov 04, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-8UNIZ1GKgBfle4ga72ARyB5Ky7g7H7QCkik_acTMeO4.jar
    Nov 04, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-ubrRKSQeD9iICtGF3mTd8-qBVfqS3IhMUk46qo8fOf8.jar
    Nov 04, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-NAYJMm9drJ07Uh5GpURcQHqpUZKydDebPcQpox_KM4k.jar
    Nov 04, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-h7wuiGpEuR8WkZeDH7djOJJ8GxLfjkBj9BqL8KprTPA.jar
    Nov 04, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-yBpFa6zyPLy7WCM0HmeHV701UeafEh5ueRwbUE4k_T4.jar
    Nov 04, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-Iqaf59O1ygl2BpgDrOBWjicG0lfmpFJvW4KVF7-w5os.jar
    Nov 04, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-j43M172blxcfMrk13AqlTiVnp7JSE9U5BryQvhlAI_8.jar
    Nov 04, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 196 files cached, 30 files newly uploaded in 0 seconds
    Nov 04, 2020 6:45:47 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 04, 2020 6:45:47 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 04, 2020 6:45:47 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 04, 2020 6:45:47 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 04, 2020 6:45:47 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 04, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98997 bytes, hash 0d39782b8defa57585f0caff2d8bd28c01c5599b1626cea1950229d52b790dd4> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-DTl4K43vpXWF8Mr_LYvSjAHFWZsWJs6hlQIp1St5DdQ.pb
    Nov 04, 2020 6:45:48 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Nov 04, 2020 6:45:49 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-03_22_45_48-4017236563655202754?project=apache-beam-testing
    Nov 04, 2020 6:45:49 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-03_22_45_48-4017236563655202754
    Nov 04, 2020 6:45:49 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-03_22_45_48-4017236563655202754
    Nov 04, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-04T06:45:48.252Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
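
This warning is expected for these runs: the job requests a fixed worker count and disables autoscaling, so the max-worker setting has no effect. A minimal sketch of the corresponding pipeline options follows, assuming the standard Dataflow options interfaces; the values are illustrative, not taken from the job configuration.

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class WorkerPoolOptionsSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);

        // A fixed pool of 5 workers; with autoscaling set to NONE the service
        // ignores maxNumWorkers, which is what the warning above reports.
        options.setNumWorkers(5);
        options.setMaxNumWorkers(5);
        options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE);
      }
    }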
    Nov 04, 2020 6:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T06:45:56.793Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 04, 2020 6:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T06:45:57.876Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 04, 2020 6:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T06:45:57.926Z: Expanding GroupByKey operations into optimizable parts.
    Nov 04, 2020 6:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T06:45:57.966Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 04, 2020 6:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T06:45:58.042Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 04, 2020 6:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T06:45:58.078Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 04, 2020 6:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T06:45:58.112Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 04, 2020 6:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T06:45:58.134Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 04, 2020 6:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T06:45:58.619Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 04, 2020 6:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T06:45:58.698Z: Starting 5 workers in us-central1-f...
    Nov 04, 2020 6:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T06:46:23.955Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 04, 2020 6:46:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T06:46:29.046Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 04, 2020 6:46:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T06:46:41.204Z: Workers have started successfully.
    Nov 04, 2020 6:46:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T06:46:41.233Z: Workers have started successfully.
    Nov 04, 2020 6:47:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T06:47:25.466Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 04, 2020 6:47:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T06:47:25.665Z: Cleaning up.
    Nov 04, 2020 6:47:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T06:47:25.750Z: Stopping worker pool...
    Nov 04, 2020 6:48:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T06:48:16.035Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 04, 2020 6:48:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T06:48:16.088Z: Worker pool stopped.
    Nov 04, 2020 6:48:27 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-03_22_45_48-4017236563655202754 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): addc4876-c895-4e8b-869f-25694cbf853f and timestamp: 2020-11-04T06:48:27.208000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    25.192

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 04, 2020 6:48:27 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 53.07 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 7s
112 actionable tasks: 78 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/37ozprpxhdwac

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1196

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1196/display/redirect?page=changes>

Changes:

[tysonjh] [BEAM-11130] Exclude OrderedListState category for Dataflow V2.

[samuelw] [BEAM-11144] Fix trigger prefetching so that the correct trigger index

[Boyuan Zhang] Exclude SDF test suite because it requires support of self-checkpoint.

[tysonjh] Add Dataflow Runner V2 ValidatesRunner streaming test configuration.

[noreply] Implementing Python Bounded Source Reader DoFn (#13154)


------------------------------------------
[...truncated 305.46 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 04, 2020 12:45:52 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 04, 2020 12:45:52 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 04, 2020 12:45:52 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 04, 2020 12:45:52 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 04, 2020 12:45:52 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 04, 2020 12:45:52 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 04, 2020 12:45:52 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1452961901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 04, 2020 12:45:52 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 04, 2020 12:45:52 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 04, 2020 12:45:52 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 04, 2020 12:45:52 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 04, 2020 12:45:52 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 04, 2020 12:45:52 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 04, 2020 12:45:53 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 04, 2020 12:45:53 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Nov 04, 2020 12:45:54 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-EgHUx2Z56Ydjkn3ChsKKKAoQ9AEgoqXbWELFG2WO-D0.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-74tWr-pGvoA9YeW_B-Lpr49N2oiT7zcKg_5EQy5ewsg.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-ySxsGh8p-8JfQDIjEelMjnB0I0G9YMfCkk4PPjEUadQ.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-g0c0j0yGPA1rVrxvQV3akUb6C_1lgKknjrsVlKBI03Q.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5218177187781020577.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-v8c-TUB628Sbstk7Au8XM8wzo9Qi0YTg0zwylgu-Xdw.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-OeQqLOuS2UCSTwrZYAWDl0IUbQ5YQw3vhNtjr_-o-PI.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-TIUFVCXfu1YensR6w9JcEYRxyHWCt3BvLnvDuHEP2W0.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-keG2xHahb2KpYye5oJW-_Ma81EPG_h0MSj1SEsYGFJY.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-AP0tOcC1PG1J91JaOSNQKYHZCdbM-yx6E1fGel1ck5s.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-J-sgDgI6B_XQT5vjoaxWcHubAh9eQpYrBd0_qCFsEIM.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-zwRbyBCSBPSpDbgSGUd1tgGXmjTuvTDF7TIkI-KK7lQ.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-D0GcKD97uNhzv0GewQ4WbuMoopjoUAHCr-ATMael5Rg.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-pnF-OtYsDTUGgfJCb73urEAR9EBj7qsyUSUQe_2ETX0.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-DHcZpNtjSXUpvhDIgIZylbyhS9KOY8MbhqxAn6IpR1s.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-98h-vZgedCZtFioTqIHIcEmTYFnnqQ6T2lWcBX40m4E.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-T4fZ9Lj3_tGDILKTe-zqhaXj4uXlG8z5onXdBOPKWM0.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-hHRinVBZoNheqUKwdKvb1A8dd7goqykfyVqrI6a1HHo.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-xdGf9bEXe-oMRdWzybDv6GM_otC5JBg9Jt6GxtmGmfk.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-qgmpkLC3BZQK_o9iYz8a8wTH6PM4bLUBeRE6LjI-L5k.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-49ah4Itl1inA1klR8FSi2xUOroo-GuTLKaFDwvlntHE.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-Q7ld2_9AMOSyJ6OlXjh65Z2sG0rRNoGXL8oUqOjZz8c.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-LSvfDPgrKa9BQ1zvt7FRxnpaUcHd-1UCydJSnQ-Ym0g.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-XSxL_-1Lg_h5XImBKKjQqQFtiKxucYcblp4cQ_JfjO8.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-kc3Ted01uDK4rwEXylrZjnjwgpvpaSkc--lQN7FRSoI.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-2AtVgKSO5H7sF3Nh8xQm6auKLVNu8_dJsoOUi3VcgMI.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-sb-JJZUfvFaEDc-okjYPF8YiGIZ2UiXRfAwG6RCjHGY.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-hsHeWaSHWR-_0OW82JJ7TdVzGZQEuV8dMsQVTiz_Eh8.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-EgHUx2Z56Ydjkn3ChsKKKAoQ9AEgoqXbWELFG2WO-D0.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-6PHxgxeCvT_HUBy3SzMuYm_GYRuom1LJ4PgpUi4xZbw.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-QDVx8LFAf_1Na1vhUR1ro0zkU0Dqg2eYmJptG5LBUwQ.jar
    Nov 04, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-Ntvt2eCFdzPkqXnZoiUAptb_3KlPyBgVQEefKuiOVvY.jar
    Nov 04, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 196 files cached, 30 files newly uploaded in 1 seconds
    Nov 04, 2020 12:45:57 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 04, 2020 12:45:57 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 04, 2020 12:45:57 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 04, 2020 12:45:57 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 04, 2020 12:45:57 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 04, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98996 bytes, hash 12870263eb409b3c9624d636a861dd2920200966cd63e6c94b11063ba27c184c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-EocCY-tAmzyWJNY2qGHdKSAgCWbNY-bJSxEGO6J8GEw.pb
    Nov 04, 2020 12:45:57 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Nov 04, 2020 12:45:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-03_16_45_57-16921251368171212624?project=apache-beam-testing
    Nov 04, 2020 12:45:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-03_16_45_57-16921251368171212624
    Nov 04, 2020 12:45:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-03_16_45_57-16921251368171212624
    Nov 04, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-04T00:45:57.683Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 04, 2020 12:46:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T00:46:05.856Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 04, 2020 12:46:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T00:46:06.465Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 04, 2020 12:46:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T00:46:06.498Z: Expanding GroupByKey operations into optimizable parts.
    Nov 04, 2020 12:46:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T00:46:06.527Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 04, 2020 12:46:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T00:46:06.597Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 04, 2020 12:46:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T00:46:06.632Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 04, 2020 12:46:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T00:46:06.667Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 04, 2020 12:46:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T00:46:06.691Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 04, 2020 12:46:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T00:46:07.156Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 04, 2020 12:46:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T00:46:07.226Z: Starting 5 workers in us-central1-f...
    Nov 04, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T00:46:25.175Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 04, 2020 12:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T00:46:34.227Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 04, 2020 12:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T00:46:54.528Z: Workers have started successfully.
    Nov 04, 2020 12:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T00:46:54.554Z: Workers have started successfully.
    Nov 04, 2020 12:47:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T00:47:26.888Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 04, 2020 12:47:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T00:47:27.027Z: Cleaning up.
    Nov 04, 2020 12:47:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T00:47:27.108Z: Stopping worker pool...
    Nov 04, 2020 12:48:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T00:48:18.627Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 04, 2020 12:48:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-04T00:48:18.671Z: Worker pool stopped.
    Nov 04, 2020 12:48:24 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-03_16_45_57-16921251368171212624 finished with status DONE.


Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ae6769d8-8c89-482c-ad77-aef1431e38db and timestamp: 2020-11-04T00:48:24.209000000Z:
                     Metric:                    Value:
                   read_time                    11.802
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 04, 2020 12:48:24 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
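This warning means the run collected its metrics (read_time, fields_read above) but had no InfluxDB measurement or database configured, so nothing was exported. A rough sketch of a configured publisher setup follows; the InfluxDBSettings builder methods, host, database and measurement names are all assumptions for illustration and have not been checked against this Beam snapshot.

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    public class InfluxSettingsSketch {
      public static void main(String[] args) {
        // All values and builder method names below are assumed, not verified.
        InfluxDBSettings settings =
            InfluxDBSettings.builder()
                .withHost("http://localhost:8086")
                .withDatabase("beam_test_metrics")
                .withMeasurement("sql_bqio_read_java_batch")
                .get();
        // With a measurement and database present, InfluxDBPublisher would publish
        // the collected metrics instead of logging the "Missing property" warning.
        System.out.println(settings);
      }
    }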

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 41.8 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 7s
112 actionable tasks: 74 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/u6i42ez26f524

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1195

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1195/display/redirect?page=changes>

Changes:

[noreply] [BEAM-11146] Add fasterCopy option to Flink runner (#13240)

[noreply] [BEAM-10123] Add commit transform. (#12572)

[noreply] [BEAM-5504] Change Pubsub avro table jira task number in CHANGES.md

[noreply] [BEAM-5570] Update javacc dependency (#13094)


------------------------------------------
[...truncated 306.13 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 03, 2020 6:45:37 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 03, 2020 6:45:37 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 03, 2020 6:45:37 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 03, 2020 6:45:37 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 03, 2020 6:45:37 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 03, 2020 6:45:37 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 03, 2020 6:45:38 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1452961901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
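The IllegalStateException above is Beam's standard complaint when a PCollection of Row elements reaches coder inference without a schema attached. A minimal, self-contained sketch of the remedy the message itself suggests (PCollection.setRowSchema) follows; the schema fields, values and class name are placeholders for illustration and are not the columns or code used by BigQueryIOPushDownIT.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Placeholder schema; the real table has many more columns.
        Schema schema =
            Schema.builder().addStringField("author").addInt64Field("score").build();

        PCollection<Row> rows =
            p.apply(
                Create.of(Row.withSchema(schema).addValues("alice", 3L).build())
                    .withCoder(RowCoder.of(schema)));

        rows.apply(
                "RowMonitor",
                ParDo.of(
                    new DoFn<Row, Row>() {
                      @ProcessElement
                      public void processElement(@Element Row row, OutputReceiver<Row> out) {
                        out.output(row); // pass-through, e.g. to record a read timestamp
                      }
                    }))
            // Without this call, coder inference on the ParDo output fails with the
            // same "Unable to return a default Coder ... PCollection.setRowSchema"
            // error shown in the stack trace above.
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }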

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 03, 2020 6:45:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 03, 2020 6:45:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 03, 2020 6:45:38 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 03, 2020 6:45:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 03, 2020 6:45:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 03, 2020 6:45:38 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 03, 2020 6:45:38 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 03, 2020 6:45:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
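The plan above shows the projection (usedFields) and the filter being handed to the IO instead of being evaluated in BeamCalcRel. For reference, the same mechanism is available directly on BigQueryIO when reading with the Storage API; the sketch below only illustrates that mechanism and is not the integration test's code. The table spec, field list and restriction string are copied from the log as placeholders.

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply(
            "Read Input BQ Rows with push-down",
            BigQueryIO.readTableRows()
                .from("apache-beam-testing:beam.HACKER_NEWS") // placeholder table spec
                .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                // Projection push-down: only these columns are read from storage.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Predicate push-down: evaluated by the BigQuery Storage Read API.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }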
    Nov 03, 2020 6:45:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-7n_74IOolZSvD4-DWrhKARxF-g5W1sOG5rw_rRzcRh4.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-tINEogu4juOSD4wQXVsX0bUvSrCRLfz8J-ctFKI36Bw.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-_FCjcx5diso4_Bci2es9Aywv_K-uLPHC4ZnplJq7Re4.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-IHfj2KADRwv74NUi0MWHNn0FiYUCmtvoqISRKaa_bmo.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-wt2NDijnK1djJONV6KNEl8zC3ifq4zr5X1CITuVf5jg.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-UNRLB5ddwCHoVhkuialxQ24jM5F19Y5WKjPLX0nBGFI.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-jFUQGlIDaSrJg4B14hbc_KhTidGc7_kLS8pjER6XWXM.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-IDKNqXlx2UaTYl7yxemlzm9gBiGDZ0GI8d-gt-AEiAk.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8568205400936520099.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-dmoNoEg_7VZ4UYApvOcSisXBNpPKgvJhj4hrvjcIEps.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-C-1KcLa5ZaTUhy_8BWlrH54RrWhyAckvr-9wftTqe60.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-RYk5UXfrsb2WFt2PdW24b9PGRg_StKgz1OszjPdBsyY.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-u_LsDAvK4w5-SrasaDPM836TDkDKI8ckf23mHvufauQ.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-00u_c1IVMXUGlTP3os8gRmyx2TvZ0rMSIRmbTzWNgZ8.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-Oqh0vIhDY3vlRFxfNHKk3RmX5o9Z-jt_jds1ywzZ6Dc.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-xVmmA2qz97uISLtjN6MeuZRskqlyTUqLp-hcBXXbiQQ.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-f5b8TeRiUJPzDOfGw8kvKQaykNOymC0gi7abRcNandQ.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-gl8D-p15V4KoIpBl9f0KYjB4iKseshF979rFf6os-FY.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-ClPKKR73ex-hcFWuB18Ryz4eD96g2j-e1kCYhVn42wU.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-igsj1yzozIhRrXymSz0McLNqL4d9GELl0ulniYSW-Fs.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-cbvFrEulSw0c4SXft52lmd-a38eBneXWZKPEEc6Xx0s.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-7n_74IOolZSvD4-DWrhKARxF-g5W1sOG5rw_rRzcRh4.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-rjJ3jbzxNnux8hpGLf3wH8kg1fY09BdgxhsiKOs7C5A.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-5B2co_yB9Fd7Z_RCUnJQ__BItqMOdVsBMIS0cg3aKE4.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-6Ds5jSvUdbShLF7DiVADR4qHmjMYYJPm1uy22yjtad4.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-XmX5dgPpS-ACBRQ826WypWEwy387IkEbOpx1z1MSgZk.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-412EVcWcwW_fzkUyA8Urj8eGfg7aFGUfXkAKPDFNhsY.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-ueZdtYRvUDJtFr5p3JlWIomYJDxsJpQ6LuTb_ZfnPvk.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-3YViQ5QXTDsVLRpqtjVLr0Gw5x33VkZE1ImsNxxHeiM.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-D031YmrmtM_h1aRKBXy-em0sOMH3TJIXF1k2eYV74Og.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-hAdtgq8U2JXlGgxkPHwfJ4L5L_noNmrqGa4kGEOFVmo.jar
    Nov 03, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-ZpJf48JUURgF57AoKHeuSW3SUn5R2KMOpOMKBPzCQac.jar
    Nov 03, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 196 files cached, 30 files newly uploaded in 1 seconds
    Nov 03, 2020 6:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 03, 2020 6:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 03, 2020 6:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 03, 2020 6:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 03, 2020 6:45:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 03, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98996 bytes, hash f42e6bedb076d1582e329b2b9785281599d549b5f5e74ccbbcdac1b8440269f6> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-9C5r7bB20VguMpsrl4UoFZnVSbX150zLvNrBuEQCafY.pb
    Nov 03, 2020 6:45:45 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Nov 03, 2020 6:45:46 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-03_10_45_45-5815919766066310093?project=apache-beam-testing
    Nov 03, 2020 6:45:46 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-03_10_45_45-5815919766066310093
    Nov 03, 2020 6:45:46 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-03_10_45_45-5815919766066310093
    Nov 03, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-03T18:45:45.251Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 03, 2020 6:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T18:45:54.240Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 03, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T18:45:55.019Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 03, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T18:45:55.095Z: Expanding GroupByKey operations into optimizable parts.
    Nov 03, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T18:45:55.118Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 03, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T18:45:55.197Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 03, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T18:45:55.238Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 03, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T18:45:55.261Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 03, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T18:45:55.295Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 03, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T18:45:55.743Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 03, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T18:45:55.837Z: Starting 5 workers in us-central1-f...
    Nov 03, 2020 6:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T18:46:07.073Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 03, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T18:46:25.571Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 03, 2020 6:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T18:46:47.894Z: Workers have started successfully.
    Nov 03, 2020 6:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T18:46:47.928Z: Workers have started successfully.
    Nov 03, 2020 6:47:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T18:47:22.950Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 03, 2020 6:47:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T18:47:23.108Z: Cleaning up.
    Nov 03, 2020 6:47:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T18:47:23.334Z: Stopping worker pool...
    Nov 03, 2020 6:48:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T18:48:37.524Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 03, 2020 6:48:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T18:48:37.573Z: Worker pool stopped.
    Nov 03, 2020 6:48:45 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-03_10_45_45-5815919766066310093 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 598da0e4-346b-4997-b316-07d20c25bcd8 and timestamp: 2020-11-03T18:48:45.861000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.023

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 03, 2020 6:48:46 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 3 mins 16.952 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 29s
112 actionable tasks: 73 executed, 39 from cache

Publishing build scan...
https://gradle.com/s/346n7t364gm6k

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1194

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1194/display/redirect>

Changes:


------------------------------------------
[...truncated 388.34 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1452961901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 03, 2020 12:45:48 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 03, 2020 12:45:48 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 03, 2020 12:45:48 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 03, 2020 12:45:48 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 03, 2020 12:45:48 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 03, 2020 12:45:48 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 03, 2020 12:45:48 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 03, 2020 12:45:48 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Nov 03, 2020 12:45:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-gI2hYDbixQU7IQayconZiYPf0VfuVAEgj9XKlRfka7k.jar
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test835926229068978291.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-acORhH_8ius1YpiIdKZrL73XaeWLhkhdtdyKOA8_roI.jar
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-XQtYJMDr-gMuxVQhc73Modyt3DBVsrjtVvB2663YNiw.jar
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-5nUGtbQmkpOtHAiH6IsZ97ja4RcBlK3XNjmJGDuaXQY.jar
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-FItBs4h_1N4snhiUALffqqjEBxsyBRgg4UPDLDhbj5w.jar
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-b3NI8ArFerwTBM8i8DT9lqM_ylHZwQH8OooLNPy1QP8.jar
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tY-NoTR33Iy6JLg8M4vu54B1V4CmuAF96WQWn3Mwywk.jar
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-tJAKsr3lSauJ85HbABUsml0qoWTGDQ0nA1uzGLUw58U.jar
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-4jDkvy1ZaeZff5utzADMLVhf0jXJPKjx9rBFF6N4huo.jar
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-60zFB7JqbKiQtSY59grytpcskTuLbx9MirKJMGdr1ks.jar
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-_naNk5R3Ec8DSBSNAiuFpa0ac6rC7YSn6np3BxEmblM.jar
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-idH61pgfzt1af7mklZLmVMAlU4hXIHadRslJUwSL6Q0.jar
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-q4QUX2YCwl2AwwukmuGY1zYrmix8aNubqf6meawObt0.jar
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-5ptDSrNMeVhwwjzGKt1jZvbrJqHLbd3U4sceQpAye6I.jar
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-gkSWOa2LEF0mTlt6GMSYRlo2Yhc6CvJDibt_idMCUBs.jar
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-gPWAfyyQb_gSWlPcAz2wqGkfXRXEiUJAEym1_91oYgI.jar
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-OwdF8kmHaRaxUU9taxRDIn5ogxjlDvKN9u4VQCfW9NU.jar
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-5thKlx8GBc6iE8gxQQ6EjDB4ho5GTcEzST17azKa8G0.jar
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-ERYjDtfCfJVkbTPx8FZ5MMNxkQl6G767e_DdlsNnURs.jar
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-v1UdgB5p4pgJ-Q1Al712mpyEdyRG7y2N_--YHD_BD0o.jar
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-jXyAIDvUs3sUbpO5mBbKN2Ssp0vvb_naIZLZ66ycGRU.jar
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-tqbqwiqJbtvQCDlFqVsTX__dsK-GNlSy-BmvU4lAyTI.jar
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-BsGVP11kDDJWzQmyHbjHvfDlc40CyHmAVt7sfliD1a8.jar
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-gI2hYDbixQU7IQayconZiYPf0VfuVAEgj9XKlRfka7k.jar
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-wXpeZZACb1E4B_zlyUMUcmbkucNER65o3RFH25JNwSw.jar
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-RmKJ47XpBHOhhOJcNtymbgempOyENx7uCGy7n1oUqgs.jar
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-WrHMcm4gRBn-eGjlpKU6n2ZnYLV5_6o32BDiiKFAPyY.jar
    Nov 03, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-YeIGGgltihoXfloRgm9VNSMrvb_J665PXQKNcZLr5AI.jar
    Nov 03, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-wx0wUBDs18tXRiQdIUKwa9cTyhteelV_p9I6rnaKu2Y.jar
    Nov 03, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-_BbI6pnXSawfT-ek6OAurRUq93C0K27hvAlxIoI1cEw.jar
    Nov 03, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/de.flapdoodle.embed/de.flapdoodle.embed.process/2.1.2/986b38302fa10018d5baccee7f655c31ee9afe5b/de.flapdoodle.embed.process-2.1.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/de.flapdoodle.embed.process-2.1.2-OasY7D5KRAimcZcWcjFwgi8Qb4B-iff1FfrVeNSih6A.jar
    Nov 03, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/de.flapdoodle.embed/de.flapdoodle.embed.mongo/2.2.0/781d14f4e3d9eeb0b4c3e00a4ec165a04b3dc5c4/de.flapdoodle.embed.mongo-2.2.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/de.flapdoodle.embed.mongo-2.2.0-vNy3lJC0jW9u4Cy1AHsqSbjRUqOTX9ycpEmHkht7vvk.jar
    Nov 03, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-csv/1.8/37ca9a9aa2d4be2599e55506a6d3170dd7a3df4/commons-csv-1.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/commons-csv-1.8-qL1WZS7UZo2dWjOZSuUvWbnjnI6w68tmhOaK7udXmmE.jar
    Nov 03, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-bW6kpQpkWYCHITqXh5sbx37T2U4YxekXjwaGXz-zIaY.jar
    Nov 03, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.codehaus.janino/commons-compiler/3.0.11/f2a6ec7fbc929c9fc87ff8bb486c0574951c5b09/commons-compiler-3.0.11.jar to gs://temp-storage-for-perf-tests/loadtests/staging/commons-compiler-3.0.11-DxpPXyZccBoxkzJErnBF_O8YtPpZUEF-Je5wvlDd2s8.jar
    Nov 03, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.beam/beam-vendor-calcite-1_20_0/0.1/6d16a59dc771784789116607a04acd9a0ef91d16/beam-vendor-calcite-1_20_0-0.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-calcite-1_20_0-0.1-1NrX_9FNKiEqNk5qBOaRlj-IwqOvKvQIGIbTVgm_v8Y.jar
    Nov 03, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.codehaus.janino/janino/3.0.11/e699e368095379ba0402ea1780a87fcaea16e68f/janino-3.0.11.jar to gs://temp-storage-for-perf-tests/loadtests/staging/janino-3.0.11-kje3HSMpGA5ZIQ6aqhAO4xNFTvCuWIYIx1yxkxlZG-E.jar
    Nov 03, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/antlr4-runtime/4.7/30b13b7efc55b7feea667691509cf59902375001/antlr4-runtime-4.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/antlr4-runtime-4.7-KmGUP4A7vR0OAt_9GbkqQY-DNAyZQ0aAnjtR4iMapsA.jar
    Nov 03, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.glassfish/javax.json/1.0.4/3178f73569fd7a1e5ffc464e680f7a8cc784b85a/javax.json-1.0.4.jar to gs://temp-storage-for-perf-tests/loadtests/staging/javax.json-1.0.4-Dh3sQKHt6WWUElHtqWiu7gUsxPUDeLwxbMSOgVm9vrQ.jar
    Nov 03, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.abego.treelayout/org.abego.treelayout.core/1.0.3/457216e8e6578099ae63667bb1e4439235892028/org.abego.treelayout.core-1.0.3.jar to gs://temp-storage-for-perf-tests/loadtests/staging/org.abego.treelayout.core-1.0.3--l4xOVw5wufUasoPgfcgYJMWB7L6Qb02A46yy2-5MyY.jar
    Nov 03, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.ibm.icu/icu4j/58.2/db9fd4b4c189cf1518db14c67d14a2cfcfbe59f6/icu4j-58.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/icu4j-58.2-lT4eg7K-fD6i-I2obBNhT0fp5x01eMhSHX8Yd1a2OWI.jar
    Nov 03, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/ST4/4.0.8/a1c55e974f8a94d78e2348fa6ff63f4fa1fae64/ST4-4.0.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/ST4-4.0.8-WMqrxAyfdLC1mT_YaOD2SlDAdZCU5qJRqq-tmO38ejs.jar
    Nov 03, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/antlr-runtime/3.5.2/cd9cd41361c155f3af0f653009dcecb08d8b4afd/antlr-runtime-3.5.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/antlr-runtime-3.5.2-zj_I7LEPOemjzdy7LONQ0nLZzT0LHhjm_nPDuTichzQ.jar
    Nov 03, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/antlr4/4.7/cd6df46532bccabd8127c18c9ca5ef481962e931/antlr4-4.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/antlr4-4.7-eGclcCizNzrwEd7nts6bWHqP1cegsl9osv9MuQvoqgc.jar
    Nov 03, 2020 12:45:54 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 183 files cached, 43 files newly uploaded in 1 seconds
    Nov 03, 2020 12:45:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 03, 2020 12:45:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 03, 2020 12:45:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 03, 2020 12:45:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 03, 2020 12:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 03, 2020 12:45:54 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98995 bytes, hash d919a7398e68f82fd86b381befe89fe5ec5ffb6700c8c80dcde27040f60c203a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-2RmnOY5o-C_Yazgb7-if5exf-2cAyMgNzeJwQPYMIDo.pb
    Nov 03, 2020 12:45:55 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Nov 03, 2020 12:45:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-03_04_45_55-18406604342106238925?project=apache-beam-testing
    Nov 03, 2020 12:45:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-03_04_45_55-18406604342106238925
    Nov 03, 2020 12:45:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-03_04_45_55-18406604342106238925
    Nov 03, 2020 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-03T12:45:55.151Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
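
The warning above is expected for these runs: with autoscalingAlgorithm=NONE the worker pool size comes from numWorkers, so maxNumWorkers has no effect. A minimal sketch of the relevant Dataflow options (illustrative values, not this job's actual flags):

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class FixedWorkerPoolOptionsSketch {
      public static void main(String[] args) {
        // Illustrative values only: a fixed pool of 5 workers. With autoscaling
        // disabled, maxNumWorkers is ignored, which is what the warning reports.
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(
                    "--runner=DataflowRunner",
                    "--project=my-project",
                    "--region=us-central1",
                    "--autoscalingAlgorithm=NONE",
                    "--numWorkers=5",
                    "--maxNumWorkers=5")
                .as(DataflowPipelineOptions.class);
        System.out.println("Fixed worker pool of " + options.getNumWorkers());
      }
    }
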
    Nov 03, 2020 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T12:46:03.225Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 03, 2020 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T12:46:03.868Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 03, 2020 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T12:46:03.907Z: Expanding GroupByKey operations into optimizable parts.
    Nov 03, 2020 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T12:46:03.939Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 03, 2020 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T12:46:04.010Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 03, 2020 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T12:46:04.032Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 03, 2020 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T12:46:04.057Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 03, 2020 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T12:46:04.077Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 03, 2020 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T12:46:04.407Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 03, 2020 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T12:46:04.476Z: Starting 5 workers in us-central1-f...
    Nov 03, 2020 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T12:46:31.578Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Nov 03, 2020 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T12:46:31.611Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Nov 03, 2020 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T12:46:36.914Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 03, 2020 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T12:46:38.057Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 03, 2020 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T12:46:51.037Z: Workers have started successfully.
    Nov 03, 2020 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T12:46:51.075Z: Workers have started successfully.
    Nov 03, 2020 12:47:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T12:47:26.534Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 03, 2020 12:47:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T12:47:26.685Z: Cleaning up.
    Nov 03, 2020 12:47:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T12:47:26.765Z: Stopping worker pool...
    Nov 03, 2020 12:48:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T12:48:18.791Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 03, 2020 12:48:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T12:48:18.844Z: Worker pool stopped.
    Nov 03, 2020 12:48:24 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-03_04_45_55-18406604342106238925 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 75c4c599-d2db-4b0d-abe3-062b70fc1fc7 and timestamp: 2020-11-03T12:48:24.402000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.893

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 03, 2020 12:48:24 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 46.865 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 8s
112 actionable tasks: 75 executed, 37 from cache

Publishing build scan...
https://gradle.com/s/ti5jmz5dt73d6

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1193

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1193/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10892] Add Proto support to Kafka Table Provider (#12838)


------------------------------------------
[...truncated 307.97 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 03, 2020 6:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 03, 2020 6:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 03, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 03, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 03, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 03, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 03, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1452961901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
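
The failure above means the ParDo's output PCollection of Beam Rows never received a schema, so no coder could be inferred. A minimal sketch of the usual remedy, calling setRowSchema on the Row output (placeholder field names and values, not the test's actual code):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(Create.of("placeholder input"))
                .apply(ParDo.of(new DoFn<String, Row>() {
                  @ProcessElement
                  public void processElement(@Element String ignored, OutputReceiver<Row> out) {
                    out.output(Row.withSchema(schema)
                        .addValues("someone", "story", "a title", 3L).build());
                  }
                }))
                // Without this call, coder inference fails exactly as in the trace above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }

Either setRowSchema or an explicit setCoder with a coder for Row satisfies the check described in the message.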

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 03, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 03, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 03, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 03, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 03, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 03, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 03, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 03, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
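
With push-down, the projection (usedFields) and the filter above are evaluated by the BigQuery Storage API rather than inside the pipeline; in the SQL path this is derived automatically from the query, which is what the BeamPushDownIOSourceRel node records. A rough standalone equivalent expressed directly with BigQueryIO (the table name is a placeholder, not the table this test reads):

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Only four columns are requested and the row restriction is applied
        // server-side, mirroring the pushed-down plan in the log above.
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full")  // placeholder table
                    .withMethod(TypedRead.Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
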
    Nov 03, 2020 6:45:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 03, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 03, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-RVFcGU0rCEZI9oLkW0Wtm0uJXHzmTkCLh50qyCFfsC8.jar
    Nov 03, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-yQVPu__4KUyhw2yiIdnX58wIzfZnjhVzA5cRovk2TD0.jar
    Nov 03, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-LT6-mWFk_IUCPfempr3t-U8p2Un2NrCt29kXR4mleC0.jar
    Nov 03, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-xOVdXHmaLH5UpGpJ3G5kjIUm-CIZs_BftX558tMrxog.jar
    Nov 03, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-tQtU13Zg6Ec73Y2fWT7gUCz-_SV2m61miLv0zuYZnSo.jar
    Nov 03, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-vfaoaLiatcoK-BU1YXu12qvcrQYwWpeRNl0fLY9Qc7I.jar
    Nov 03, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-MRG8VcPu7F14ZUHWnItqEcwcTMyxlUk0yGgbyvQHf5I.jar
    Nov 03, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-VHjCqQOvUTXfKIcDJF4V04aDx2J1zP84v8sGwSzX2OM.jar
    Nov 03, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-B0qKWUEvhANZ8DMy_3DXDm_QkNDgLgIjHMG-DGoQNNM.jar
    Nov 03, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-lizyO8XuPGXjhvAxb-9-NCMrriP07I-Kc7IH7-GrjDk.jar
    Nov 03, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-wJmMGl1jy4Cqnvwr6L4W0BJbRAkdfRfYOa4WmNOw9Z4.jar
    Nov 03, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-cpqgR9vBkZkJBnoCkbFA0ep5zDOZ2BK883ZM_MAPJdw.jar
    Nov 03, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-aGP6FfJfrtCrrWEuxTI2zrJadkCO_6TNp4L1FjXYmmo.jar
    Nov 03, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5429322929832824867.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-XcVXD48L0XAhvT7YrBzb5aDERs9y-yxLIxvOZXWZDuw.jar
    Nov 03, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-E1Fw24GPCaOzQxk8i1JHVZYrWnxHq6x3ZSQ26FXwnn8.jar
    Nov 03, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-T99VVkZFjKHLuIFPjMIT2OVNSFFtGyBvdLW2PnrGH60.jar
    Nov 03, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-ovj-APyWorIO7bkqI9s6f8sCLJc8BDwHJrD7WaQU5sE.jar
    Nov 03, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-OCM7lMcPLFfvrlgpoZGePhzF_Mnf_FlYfw8DKpzbE4w.jar
    Nov 03, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-0U78W9d0lngGZYsQsqSp3ctTiK0HIOPkHY-KZ69dARg.jar
    Nov 03, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-VpF0vaJD_2lST_mpgFYDgHB1qPCvnV51ZdP59CN2nU0.jar
    Nov 03, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-GUKohEGcAxLCJAB1wMg5ir-nUvsOsXoo6rnYpDzaOns.jar
    Nov 03, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-P53zVZ_VV0FKj9JMuYQ5ppLqX2lLwSCFvZxpZY6-cOk.jar
    Nov 03, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-1YGLUUsEc4by13AwfZyu5wkKF18F8ctT2SxglgdTSsI.jar
    Nov 03, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-fmM3tGKOYxnJtZDY8AKus7rAMHXnWZM3WWrwe7nb3pc.jar
    Nov 03, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-RVFcGU0rCEZI9oLkW0Wtm0uJXHzmTkCLh50qyCFfsC8.jar
    Nov 03, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-prVjtTHO9OB1A7IE3FNJF0aikqJ8k0P6wYgEpgWr-7k.jar
    Nov 03, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-E9eiKbToDPktiChjLyScDkRoo4gPKevMX-_aWwdAnMo.jar
    Nov 03, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-hli_C0ziglqKUq28vbUTg1uiEQMTg7Pksb6QdunnOj8.jar
    Nov 03, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-pqgpDME5e1ess0d1rRU5TKBCBvzGZyH-ODNlYo5MzjI.jar
    Nov 03, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-MfowR3BkoNbPAR0lOU1L88PxscM5g3-l3kKlHzcKHfU.jar
    Nov 03, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-FGs2hb0Zcltt9II8jHsSgOGIxVsxnT0tb2fGdLUjFGI.jar
    Nov 03, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 196 files cached, 30 files newly uploaded in 1 seconds
    Nov 03, 2020 6:45:45 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 03, 2020 6:45:45 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 03, 2020 6:45:45 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 03, 2020 6:45:45 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 03, 2020 6:45:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 03, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98996 bytes, hash 2bdf3b022705bac66e5c0f24edc5a10ba0362df66adbe1523ff7d5de132efe75> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-K987AicFusZuXA8k7cWhC6A2LfZq2-FSP_fV3hMu_nU.pb
    Nov 03, 2020 6:45:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Nov 03, 2020 6:45:47 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-02_22_45_45-17059911589540492002?project=apache-beam-testing
    Nov 03, 2020 6:45:47 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-02_22_45_45-17059911589540492002
    Nov 03, 2020 6:45:47 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-02_22_45_45-17059911589540492002
    Nov 03, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-03T06:45:45.676Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 03, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T06:45:53.567Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 03, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T06:45:54.323Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 03, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T06:45:54.357Z: Expanding GroupByKey operations into optimizable parts.
    Nov 03, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T06:45:54.385Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 03, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T06:45:54.474Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 03, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T06:45:54.503Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 03, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T06:45:54.536Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 03, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T06:45:54.570Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 03, 2020 6:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T06:45:55.134Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 03, 2020 6:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T06:45:55.211Z: Starting 5 workers in us-central1-f...
    Nov 03, 2020 6:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T06:46:11.586Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 03, 2020 6:46:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T06:46:23.050Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Nov 03, 2020 6:46:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T06:46:23.087Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Nov 03, 2020 6:46:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T06:46:28.488Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 03, 2020 6:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T06:46:42.676Z: Workers have started successfully.
    Nov 03, 2020 6:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T06:46:42.717Z: Workers have started successfully.
    Nov 03, 2020 6:47:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T06:47:19.398Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 03, 2020 6:47:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T06:47:19.798Z: Cleaning up.
    Nov 03, 2020 6:47:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T06:47:19.885Z: Stopping worker pool...
    Nov 03, 2020 6:48:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T06:48:04.438Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 03, 2020 6:48:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T06:48:04.496Z: Worker pool stopped.
    Nov 03, 2020 6:48:12 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-02_22_45_45-17059911589540492002 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f038158d-e579-46fd-91e8-2a98f5d01384 and timestamp: 2020-11-03T06:48:12.519000000Z:
                     Metric:                    Value:
                   read_time                    16.207
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 03, 2020 6:48:12 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
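
The warning indicates the run was not given an InfluxDB database/measurement, so the collected metrics are only printed, not published. A hedged sketch of supplying them through the test-utils publishing settings (assuming the InfluxDBSettings builder in org.apache.beam.sdk.testutils.publishing exists as shown; all values are placeholders):

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    public class InfluxSettingsSketch {
      public static void main(String[] args) {
        // Assumed API and placeholder values -- not this job's real configuration.
        InfluxDBSettings settings =
            InfluxDBSettings.builder()
                .withHost("http://localhost:8086")
                .withDatabase("beam_test_metrics")
                .withMeasurement("sql_bqio_read_java_batch")
                .get();
        System.out.println("InfluxDB settings configured");
      }
    }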

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 41.465 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 57s
112 actionable tasks: 75 executed, 37 from cache

Publishing build scan...
https://gradle.com/s/7wh7au4dtwbz4

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1192

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1192/display/redirect?page=changes>

Changes:

[samuelw] [BEAM-11143] Ensure that AfterWatermarkStateMachine clears late trigger

[noreply] [BEAM-11154] Check coder proto to avoid registering same coder under

[noreply] [BEAM-9444] Use GCP BOM to set package versions (#13075)

[noreply] [BEAM-11052] Memoize to_pcollection (#13066)

[noreply] [BEAM-9547] Auto-populate any unimplemented methods/properties with


------------------------------------------
[...truncated 281.71 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 03, 2020 12:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 03, 2020 12:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 03, 2020 12:45:40 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 03, 2020 12:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 03, 2020 12:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 03, 2020 12:45:40 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 03, 2020 12:45:41 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1452961901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 03, 2020 12:45:41 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 03, 2020 12:45:41 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 03, 2020 12:45:41 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 03, 2020 12:45:41 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 03, 2020 12:45:41 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 03, 2020 12:45:41 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 03, 2020 12:45:41 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 03, 2020 12:45:41 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Nov 03, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 03, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 226 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 03, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-cAGUvTU1skTKbc23iDV95Gvv88vrg0EKMvemEgRLR5k.jar
    Nov 03, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-lcrhbgIXXTwlRMo3JT257U1NaNAoj-4Jjs6Stg5hLFA.jar
    Nov 03, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-DCmU2oyKoJtNz8gOIU8KoDYeBEbAj8m1ciGAQdFohYA.jar
    Nov 03, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-zNxcNmK0nk6AySy23IuEcvPltOZSRME9Mo96Tr374xQ.jar
    Nov 03, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-nUfWYqAR0GgtEFq6oiaDZtfJMwYkpFJzRROimaJ4YX4.jar
    Nov 03, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-Ks1TSnIEeIEml127xP_L-c3hd4TJN51kWgm3cP6cjtY.jar
    Nov 03, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-dI1Pp9KS7nu5Dgr43Com5pAdfZXqVv_IMMUZTQtL17Y.jar
    Nov 03, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8538717854921880601.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-4-JnsCSn_xJ3Q6lu9Sl__R8PD1SbMEJ7Kt-0Hj0RoIU.jar
    Nov 03, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-gmSNKPm3CM3mNjrhy8nbwcqX2U_Rm7Vh8Vr9-QjxEO4.jar
    Nov 03, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-zWEY_r90LjHcLYPCvsPMOqDk4lYPEaJpaepkvXj_7Xc.jar
    Nov 03, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-9yGV4ESgO5ndqIZdGeHKYS2ljIP0KjrcoWNxrVuBeE0.jar
    Nov 03, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-QaxlFHbiWkbsi-l_ZMQ64HCRMUL0d7vA0kko-b4Wf4I.jar
    Nov 03, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-fqUk9seAS17oglNuhGW1yxT8uJ_h8khCQujARCOUtyE.jar
    Nov 03, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT--R-ckAxGQxf1vfmlEufTZzyhuqUe1S4S8Yy55ztwRdc.jar
    Nov 03, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-BIyFkr03eGRMXarFXfo3IL26ljDfY3uqW2AGhixDFcU.jar
    Nov 03, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-ZZnOQ2Gr4f7MjykjZENcwaFt-EbI-glHWGwptz751AE.jar
    Nov 03, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-gt9oybxIq2gz2iTpiVzRkdE_-ktnMzQd4BWv9o1dGnk.jar
    Nov 03, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-Dyb_guBty857zdMiEInekrXDVpPBkANcjKc0HvN8brc.jar
    Nov 03, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-oAhWxdLSIKFfRyRyYMiCKmwLXxkso-ajTCD-89W8SLY.jar
    Nov 03, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-cAGUvTU1skTKbc23iDV95Gvv88vrg0EKMvemEgRLR5k.jar
    Nov 03, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-w5L3dGcUNnG4xk5_RDDBz-AWNW_7MkGm83abOv5_QKI.jar
    Nov 03, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-mk5qMTExa_4iP2s6fc_OWYJncOzmn4j610igxyOy6xk.jar
    Nov 03, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-M8ftKWuo0CWFH2WOxjnxp6XOBUS7Ihy7imoBwQ-2Nqo.jar
    Nov 03, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-Fdq2TlqVYPYs72tsdZb8zmbHTgE6BZ2aC5jNWddA18Y.jar
    Nov 03, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-psZwR45492ITkly1rB0bAzWxx-eCO6R1F_bpK3v6Aro.jar
    Nov 03, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-bdJQqK5QLXmp3UXcnAbBiPtAuTkLks_mvamcqcZZ_Jo.jar
    Nov 03, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-oyoBp4nx-b-Z8cXJ37WPSZicIRaIb9ronVBpCh8qJ9M.jar
    Nov 03, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-IfIWxDfzf6ebQA1YIMqO70FjUgYMVov7YbbbOd7M84w.jar
    Nov 03, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-OCBAnef2OsB7ciflG3FcWwNPtfORSavLzzPWEcjby-k.jar
    Nov 03, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-F3Bpd0AUmRZ9pEK8XVALvwqNja4Dot47VRqBuyHbJm4.jar
    Nov 03, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-fiODDJvUPa3GegkG0zg9qkdH7K4Q9HjJrSaGVGj66b4.jar
    Nov 03, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 196 files cached, 30 files newly uploaded in 1 seconds
    Nov 03, 2020 12:45:46 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 03, 2020 12:45:47 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 03, 2020 12:45:47 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 03, 2020 12:45:47 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 03, 2020 12:45:47 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 03, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <98996 bytes, hash 69fa4665af67c26a5ecd9a0103df9e0b15938e6f9380558ba9a5120ff320d096> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-afpGZa9nwmpezZoBA9-eCxWTjm-TgFWLqaUSD_Mg0JY.pb
    Nov 03, 2020 12:45:47 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Nov 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-02_16_45_47-2639746578161055731?project=apache-beam-testing
    Nov 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-02_16_45_47-2639746578161055731
    Nov 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-02_16_45_47-2639746578161055731
    Nov 03, 2020 12:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-03T00:45:47.457Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 03, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T00:45:56.774Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 03, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T00:45:57.362Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 03, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T00:45:57.402Z: Expanding GroupByKey operations into optimizable parts.
    Nov 03, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T00:45:57.443Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 03, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T00:45:57.520Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 03, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T00:45:57.570Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 03, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T00:45:57.605Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 03, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T00:45:57.636Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 03, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T00:45:58.066Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 03, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T00:45:58.165Z: Starting 5 workers in us-central1-f...
    Nov 03, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T00:46:05.869Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 03, 2020 12:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T00:46:26.987Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Nov 03, 2020 12:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T00:46:27.032Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Nov 03, 2020 12:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T00:46:32.341Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 03, 2020 12:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T00:46:47.033Z: Workers have started successfully.
    Nov 03, 2020 12:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T00:46:47.066Z: Workers have started successfully.
    Nov 03, 2020 12:47:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T00:47:23.732Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 03, 2020 12:47:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T00:47:23.900Z: Cleaning up.
    Nov 03, 2020 12:47:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T00:47:24.002Z: Stopping worker pool...
    Nov 03, 2020 12:48:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T00:48:15.246Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 03, 2020 12:48:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-03T00:48:15.315Z: Worker pool stopped.
    Nov 03, 2020 12:48:21 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-02_16_45_47-2639746578161055731 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c0e3cbd9-d861-4386-a535-db8dcb60487d and timestamp: 2020-11-03T00:48:21.795000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.847

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 03, 2020 12:48:22 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 4 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 50.845 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 5s
107 actionable tasks: 74 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/wjo5wauynhwn4

Stopped 3 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1191

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1191/display/redirect?page=changes>

Changes:

[Robert Bradshaw] [BEAM-3736] Construct empty global combine values on workers when

[Robert Bradshaw] lint

[Robert Bradshaw] Order stages according to data edges as well as must-follows.

[piotr.szuberski] [BEAM-11160] Fix HadoopFormatIOIT

[noreply] [BEAM-5939] - Deduplicate constants (#13142)


------------------------------------------
[...truncated 272.88 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 02, 2020 6:45:54 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 02, 2020 6:45:54 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 02, 2020 6:45:54 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 02, 2020 6:45:54 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 02, 2020 6:45:54 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 02, 2020 6:45:54 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 02, 2020 6:45:54 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1847488743]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
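
The IllegalStateException above is the error text's own diagnosis: the RowMonitor output is a PCollection<Row> with no schema attached, so no RowCoder can be inferred when the pipeline graph is finalized. As a point of reference only, here is a minimal sketch of the remedy the message itself suggests (PCollection.setRowSchema); the class, schema and transform names are illustrative and are not taken from BigQueryIOPushDownIT:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Illustrative four-field schema; the real HACKER_NEWS table has more fields
        // and uses the column name `by` rather than `author`.
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

        PCollection<Row> rows = p.apply(
            Create.of(Row.withSchema(schema).addValues("alice", "story", "hello", 3L).build())
                .withRowSchema(schema));

        // A pass-through ParDo<Row, Row> standing in for ParDo(RowMonitor). Without the
        // setRowSchema(...) call on its output, coder inference fails exactly as logged above.
        PCollection<Row> monitored = rows
            .apply("PassThroughRows", ParDo.of(new DoFn<Row, Row>() {
              @ProcessElement
              public void process(ProcessContext c) {
                c.output(c.element());
              }
            }))
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }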

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 02, 2020 6:45:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 02, 2020 6:45:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 02, 2020 6:45:55 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 02, 2020 6:45:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 02, 2020 6:45:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 02, 2020 6:45:55 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 02, 2020 6:45:55 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 02, 2020 6:45:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
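
For context, the plan above is where the DIRECT_READ path differs from the DEFAULT path that failed earlier: BeamPushDownIOSourceRel records the projected fields (usedFields) and the supported predicate, and BigQueryTable.buildIOReader then forwards that predicate to the read, as logged in the "Pushing down the following filter" line. The sketch below, continuing the illustrative RowSchemaExample shown after the readUsingDefaultMethod failure, only applies the same query shape with SqlTransform to an in-memory PCOLLECTION; in the integration test the query instead runs against the BigQuery table provider via BeamSqlRelUtils.toPCollection, which is what enables the push-down:

    // Continues inside main() of the RowSchemaExample sketch above; also needs
    // import org.apache.beam.sdk.extensions.sql.SqlTransform;
    // The schema'd input is addressed as PCOLLECTION. This illustrates the query only;
    // it does not exercise the BigQuery DIRECT_READ push-down shown in this log.
    PCollection<Row> filtered = monitored.apply(
        "FilterStoriesAndJobs",
        SqlTransform.query(
            "SELECT `author`, `type`, `title`, `score` "
                + "FROM PCOLLECTION "
                + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));
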
    Nov 02, 2020 6:45:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-2tV82XRYkff_oR_4F-YrNzm1584NUgV9Q6nDGf5syFs.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-njAQARift53Y0Sobn1N0mG2mQ3GHfrHJz6nP9i1Sw7s.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3071924034613473036.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-wV8NyHaZjP8Co15zRxe6nYNHYnigjFSL6fNNlRkE0oA.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-YF1lVXVm2xsga6X5HZYCmdIwGsjoSfV07SmYi3iwWQ4.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-7cCd-EEcR6wAmfAsRlCQKE9vGp5L2ddy0hWqLUel61Y.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-aV7WJ22rQoYaaUF_tTqPbxm08DFTI_3IahtIuyRRH04.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-BfVypzekjlFkFXU0BZ-s6KxwfW4_pkKipsZvOf6FRYY.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-glVJU2IfrV75JT-_1QoqyUV-MrCyWerowbo5gJKFtJw.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-xqHNf9Z8be1lfbZOO9ukECSTXXTsaKB4-juvr54hkXU.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-ESQRcQLw5DTFUszW7CsZtxDNDT-te9tAPcG-98OVsbA.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-2sBN0lYhgO7LCz76zyXiuxjBwbq2XNzHFXD0IBuVLhU.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-1Il6Y3KqFo0G0EeJKkvmIWLJ8eK2jK1bvTXQcg_BZT8.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-cwcgl42h6qgnSPs8IQcvSIQBPqjijhRKED9eIgPlYpo.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-Q9_NTL84ScAnXzWnEO15A2US5M9YvbpGTFAueiEvark.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-6asaFYpsb7szEvriCo3UOe8rflPBI1csGMrNKkruJ2A.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-ZIQ9TcIs-hcdCMuYJW5jDyUAbm90LTXvZxbgIlSyyRQ.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-2tV82XRYkff_oR_4F-YrNzm1584NUgV9Q6nDGf5syFs.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-_eOdhDr3mgzglmHx411evFYp9cLxCEd_Ki5vL12TAkU.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-IGe0jrBNlEF62EetSRGWHHRijSrp61NfKByzIIVbQoQ.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-GDLDf85VG6wsuujwiN7ea5nWEgSI8uI4Ba2qexV_7FM.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-cwJJrPKhnXMjdycE2yZSQ_n7RrPWjsCfqadvEO4S-Xk.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-t2tn4xIX3F_0QDeC0CFofncIqWSELQ8JZrL7mBHpSMs.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-brNEN3ZtbMFFZp_r6jn_JouUS6sdPw6NF7IgXe8yecc.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-NZFUd5W4wwt8bJIgs9dkwG8uxXiinJpK6VSOcv4aRw8.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-1OXnCBUU5DHCM5SVA0G-iA7qjznVKS5dUf3FlBolTsc.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-UdyzYcOcfKso8miNY1BRv0PmxeDaRVluM1tVqL2Ogvg.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-elIWTeb89u83znQOJINpsRkOyFVHUT0er7EF-JUyELs.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-MTj6RyVPRw4oUNWBo9WndCwU9hbM5Lg5HK-46ab-CUo.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-gP-8NPfYLaiZUZzx0X-FA-VY6negOpaPkQTUpTmhIhk.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-wTfo1dDQaWUakmKIw4zXIQmEDnqGbvaMh_WLmG9ds_M.jar
    Nov 02, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-0Et3HkityNle4yLsNNYOwXsmZLQ46Gm-JmUUArTlGKI.jar
    Nov 02, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Nov 02, 2020 6:46:00 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 02, 2020 6:46:00 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 02, 2020 6:46:00 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 02, 2020 6:46:00 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 02, 2020 6:46:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 02, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96046 bytes, hash 9c3fd9b0bd08c392fc15c84fadebe779b635bf2d0ae0b867ae1412bedf54eb7d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-nD_ZsL0Iw5L8FchPrevnebY1vy0K4LhnrhQSvt9U630.pb
    Nov 02, 2020 6:46:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Nov 02, 2020 6:46:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-02_10_46_01-2416910734539891812?project=apache-beam-testing
    Nov 02, 2020 6:46:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-02_10_46_01-2416910734539891812
    Nov 02, 2020 6:46:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-02_10_46_01-2416910734539891812
    Nov 02, 2020 6:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-02T18:46:01.092Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 02, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T18:46:09.656Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 02, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T18:46:10.363Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 02, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T18:46:10.407Z: Expanding GroupByKey operations into optimizable parts.
    Nov 02, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T18:46:10.447Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 02, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T18:46:10.526Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 02, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T18:46:10.563Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 02, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T18:46:10.601Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 02, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T18:46:10.647Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 02, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T18:46:11.097Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 02, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T18:46:11.166Z: Starting 5 workers in us-central1-f...
    Nov 02, 2020 6:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T18:46:17.143Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 02, 2020 6:46:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T18:46:46.860Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 02, 2020 6:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T18:47:04.951Z: Workers have started successfully.
    Nov 02, 2020 6:47:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T18:47:04.996Z: Workers have started successfully.
    Nov 02, 2020 6:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T18:47:44.357Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 02, 2020 6:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T18:47:44.570Z: Cleaning up.
    Nov 02, 2020 6:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T18:47:44.691Z: Stopping worker pool...
    Nov 02, 2020 6:48:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T18:48:53.400Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 02, 2020 6:48:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T18:48:53.458Z: Worker pool stopped.
    Nov 02, 2020 6:48:59 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-02_10_46_01-2416910734539891812 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 1e4af594-b5d3-4e46-ac46-c000801a2d7b and timestamp: 2020-11-02T18:48:59.909000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     16.34

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 02, 2020 6:49:00 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 3 mins 14.123 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 33s
107 actionable tasks: 69 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/lme3qajebyeek

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1190

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1190/display/redirect>

Changes:


------------------------------------------
[...truncated 273.54 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 02, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 02, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 02, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 02, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 02, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 02, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 02, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1284831599]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 02, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 02, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 02, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 02, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 02, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 02, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 02, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 02, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Nov 02, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-LlIkenf1PN1-e8hFfKEFpYHG3YtYRcSvN1cF3PRWLPo.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-APn0eiELWPQfpwSO-xEBAElYS5UcTUaHiDgIKsc4FeU.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-7gsgkWoNrxGQkOJzxpvnlL3tqRyGKeHdZSK5Go0qbBM.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-94isWtgoAjPxHt0o8bYihGk4lZ3SlxCRdYEyr1WF3bI.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-UBppFZ3JYSBWU7g1VUIpJhKy5NyZCw_EM-w17O8aq-4.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-dote3xTX-wo6GeNro18gJ0xU8WS_dgk2sMw2gkIOsEs.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-2tdX76TYL3DHOB-wUYmFlqePhNErFUnhSNvnPVdzynY.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7654182663968393490.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-tY3UZ0L0wPX7QreallIHAhEo8yAdyV5XBL3pbd81bVg.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-6KA_4KDm2NsqHNUXSysG7X4rPP_2CGueh9aabKznKfk.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-hICEUjjDfFkCQsm6ii46UdnGn66DilWR_VMHbS9v4pU.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-gMoqY0NLek1GdSimhx_jzSgvlwTDev5wPMfDDE12gvA.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-L-332JpnFuUmZuF1CvJ9Sv4cHKogExAxQ8namFRYcZg.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-oK7ksHl6xIYzIwP-6-rtG5HUvOR4lXxVu_UftJ-fIQo.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-ZgnsCqhmiiqHB_acB5uD8EOy4JKjS_YKjCFXkgCJjCw.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-9eSvX9vKyY9-wb4bSHZCbxKMY7gul9NNR05Rtb1f3_c.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-mxOlePuX480C70A1cVaafuiCmiKNti2TXFR9vJ7sh-g.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-LlIkenf1PN1-e8hFfKEFpYHG3YtYRcSvN1cF3PRWLPo.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-04Ncrr5USGAAFpDCs1cXkEPYeRZR6fd1B6OpBmQrGgU.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-fIjOl3ocv3itnNz3vu-7SBHgCs1Z6X803RpY63NYy24.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-_Q7TUYoWWNlP5OU7D4ntgzk-Q0kf3WhTKQg-FiGM1Zw.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-HgmNOw60HUtP72JWRjLXEQCX8d_R4FDaKQUElQ0mKa4.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-QzTS2VSheEObs7XvInNEUWCvuKIQZxYMYGpawlMzFU4.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-7gVf3ky9bVcH-E9bq-I0vE4AMbfXtnFyqIadA3wKm8E.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-X_ROMq2E7Yi7wYlAQzoqQFO3pKNeIbvXu5th_E8QBmI.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-RPg1Jjr3nY_X3_Gz00zcbG7Oxnlvle1mqlvjgV2RoxU.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-x3obvSpm3EWvWyrInsxPDnGbWXkzZiXBXFPW8YzhGt4.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-YgwCYTePlwZ_40542Ru9VJX_TXixyOoYbqjCQgW-DNI.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-y6I1qtbNA_0r8I8-VwKtjHKFK20RYIjasWByRxMPprk.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-cyVrFFtV6-9RrO4Yw5de91noU7Z_7sLJhGVfvyhXNGU.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-i_rYV1JDMnHlANn6OnKnaLpr-KuBy3NpLCHy5g9nAR8.jar
    Nov 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-tCENK_xkRcKIh-WujHuunvyiz-F5DzUytsCxfcRQ2Zo.jar
    Nov 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Nov 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96046 bytes, hash acc132236ee2e9250d193be0a9b019699f9a73515c910c03f95f86ef7088c6d9> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-rMEyI27i6SUNGTvgqbAZaZ-ac1FckQwD-V-G73CIxtk.pb
    Nov 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Nov 02, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-02_04_45_19-12476233594312830369?project=apache-beam-testing
    Nov 02, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-02_04_45_19-12476233594312830369
    Nov 02, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-02_04_45_19-12476233594312830369
    Nov 02, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-02T12:45:19.762Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 02, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T12:45:26.564Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 02, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T12:45:27.294Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 02, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T12:45:27.333Z: Expanding GroupByKey operations into optimizable parts.
    Nov 02, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T12:45:27.366Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 02, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T12:45:27.454Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 02, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T12:45:27.482Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 02, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T12:45:27.516Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 02, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T12:45:27.550Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 02, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T12:45:27.888Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 02, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T12:45:27.955Z: Starting 5 workers in us-central1-f...
    Nov 02, 2020 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T12:45:55.647Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Nov 02, 2020 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T12:45:55.716Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Nov 02, 2020 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T12:46:00.941Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 02, 2020 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T12:46:02.391Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 02, 2020 12:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T12:46:20.909Z: Workers have started successfully.
    Nov 02, 2020 12:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T12:46:20.945Z: Workers have started successfully.
    Nov 02, 2020 12:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T12:46:59.829Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 02, 2020 12:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T12:46:59.989Z: Cleaning up.
    Nov 02, 2020 12:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T12:47:00.072Z: Stopping worker pool...
    Nov 02, 2020 12:48:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T12:47:59.953Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 02, 2020 12:48:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T12:47:59.999Z: Worker pool stopped.
    Nov 02, 2020 12:48:05 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-02_04_45_19-12476233594312830369 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0ddb55da-f666-4c34-86f9-76d1dc5d2d43 and timestamp: 2020-11-02T12:48:05.386000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    18.284

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 02, 2020 12:48:05 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 59.25 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 48s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/guinslchueqma

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1189

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1189/display/redirect>

Changes:


------------------------------------------
[...truncated 271.47 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 02, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 02, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 02, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 02, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 02, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 02, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 02, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@103815614]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
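
The root-cause hints above point at attaching a schema (PCollection.setRowSchema) or an explicit coder (.setCoder) to the Row-typed output of ParDo(RowMonitor). Below is a minimal, hedged sketch of that remediation in a hypothetical stand-alone pipeline (class RowSchemaDemo, with a four-field schema mirroring the query's author/type/title/score columns); it is not the actual BigQueryIOPushDownIT code, only an illustration of the API the exception message names.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaDemo {
      // Four columns matching the SELECT in the log: author, type, title, score.
      static final Schema SCHEMA = Schema.builder()
          .addStringField("author")
          .addStringField("type")
          .addStringField("title")
          .addInt64Field("score")
          .build();

      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        PCollection<Row> rows =
            p.apply(Create.of("story", "job"))
                .apply(ParDo.of(new DoFn<String, Row>() {
                  @ProcessElement
                  public void processElement(@Element String type, OutputReceiver<Row> out) {
                    out.output(Row.withSchema(SCHEMA)
                        .addValues("someone", type, "a title", 3L)
                        .build());
                  }
                }))
                // Without this call, Beam cannot infer a default coder for Row elements and
                // pipeline construction fails with an IllegalStateException like the one above.
                // An equivalent alternative is .setCoder(RowCoder.of(SCHEMA)).
                .setRowSchema(SCHEMA);

        p.run().waitUntilFinish();
      }
    }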

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 02, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 02, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 02, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 02, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 02, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 02, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 02, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 02, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Nov 02, 2020 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-o66XANp9rQGj2yq_IqJki2JHP5XDFPkaHtKhLsODmTQ.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-U4A8Jx4E5RCRD9zjMgUPP3VIXin-sEWyOgTawRhO3-w.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-fKetVEsB68xP18RsMYXvz1DzISlyknKHXsibQ9I4IQA.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-gADdfdi8dfed2-Zuvteo9p8AzwZ0XiqrBL73Mne-ZXI.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-P5BZn1UB7b8jdO2WpTK1jT0QbeIItXZBJK1kkAb1Dow.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-CS6lT0grRSApsVI_mInhTvSdn629UhpeZUCfXwvgO_o.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-sVaUPtXI1yQqIFaSUu2SuHJGQEUbmOThrbFVAJkBqiY.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-MqFpSwC07ugkHJxmt0PatSS-BdO1B0-cY44gHZnxvPs.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-h91ft1GBhv71wKHzMAqVhFCGAVl1ah8-lwdtwTZpdJM.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6251670175469776155.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-PrPEiaoEDWLbQSdJ_W-J4ey6pOZ5SA6qdbNnEivTNrQ.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-JCkICHDorLkbci6kjjE8SSOmQfUapP-0cIALlDW8rUk.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-ui3aHKT5c86aiSEeBPCWA5CYyjkCYp5UcaLUGLsTSSQ.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-vo-b0oaR1jHuibZJwJHgMq_-G_LJWta8g_KXFh_ZApA.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-uzxH8bXPQ9qia182iULR6wLo2JTwyX-bZrHCCq1Hmjk.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-G8raUIFTO15f697o0KxIV7Q0z3TonzuYJl6QMGoTfv8.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-nozefTZqKz8msqXMoANYJq6sPEO7fjHGBY6FZ_SmLiA.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-wI7VIiPRzgB-DsJJbcy5VtCUCvgAsts0gmZbRciYrb4.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-RGy0pcPLablBLUesMeILfMRFUWLdJlbasa_w-NNfwgA.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-MRDaejfPhP5cB0mKEzBB6p_qq7MaGmQMBTQxvL1bZsw.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-ZtPYFB2twgqD72nCtgLqNkwr6MjqavCbYgm9TySJQaE.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-bzZ_V37BqqN2fFXvXBQQ6rxnvFimM2W6SPuaw3yaCsY.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-KqKwSWMjykXFODCa4OAU0mJ1o9dpxW-aQ_3l2--w01Y.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-tnnnRVeiK8F8nNoEG07iDrkOTxNmlM4P7sROCKDzJoI.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-GIGpvH6JC5aXwULLH-gXJnQ0wO3R5h-AH1mXqxX0Elo.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-CMDgnjxFNbTSeCkL068hhcMMIFq9x896ZJuH3IHZZig.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-pAKTrQ-DDW14BKg_hWlea7OTPgfvlG7QQF-uXSVn6K4.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-45wui47-ohUkiYlKzwSHosiNG2PItRNpHwDgi4tsPjA.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-sVaUPtXI1yQqIFaSUu2SuHJGQEUbmOThrbFVAJkBqiY.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-LXcXTsD26RBO15lGn35ricDHLh8HpGm8UwG14iDyv88.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-tAVjl8e5nbAoXVH7NUBqAsbGCnmHOWlJk_fdOJun8y8.jar
    Nov 02, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-vVFZEAofBrEOVOfP-4acRirN1cd9LWhAwr8A2oSuYs8.jar
    Nov 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Nov 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96042 bytes, hash 050fd825717f33811930c45e5d6ae2e0e504d24c4e192e92b2a31b004baca5c0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-BQ_YJXF_M4EZMMReXWri4OUE0kxOGS6SsqMbAEuspcA.pb
    Nov 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Nov 02, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-01_22_45_16-4004597811428913065?project=apache-beam-testing
    Nov 02, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-01_22_45_16-4004597811428913065
    Nov 02, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-01_22_45_16-4004597811428913065
    Nov 02, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-02T06:45:17.028Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 02, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:45:25.679Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 02, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:45:26.268Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 02, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:45:26.308Z: Expanding GroupByKey operations into optimizable parts.
    Nov 02, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:45:26.335Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 02, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:45:26.588Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 02, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:45:26.616Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 02, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:45:26.650Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 02, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:45:26.684Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 02, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:45:27.143Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 02, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:45:27.226Z: Starting 5 workers in us-central1-f...
    Nov 02, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:45:37.995Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 02, 2020 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:45:51.773Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 02, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:46:10.723Z: Workers have started successfully.
    Nov 02, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:46:10.765Z: Workers have started successfully.
    Nov 02, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:46:45.241Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 02, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:46:45.598Z: Cleaning up.
    Nov 02, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:46:45.689Z: Stopping worker pool...
    Nov 02, 2020 6:47:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:47:36.685Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 02, 2020 6:47:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T06:47:36.742Z: Worker pool stopped.
    Nov 02, 2020 6:47:42 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-01_22_45_16-4004597811428913065 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 6fc00e5d-a125-4b84-9485-a49dea660009 and timestamp: 2020-11-02T06:47:42.965000000Z:
                     Metric:                    Value:
                   read_time                    13.348
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 02, 2020 6:47:43 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 40.185 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 27s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/7yi2lanzpe77s

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1188

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1188/display/redirect>

Changes:


------------------------------------------
[...truncated 271.48 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 02, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 02, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 02, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 02, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 02, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 02, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 02, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1795325009]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 02, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 02, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 02, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 02, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 02, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 02, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 02, 2020 12:45:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 02, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Nov 02, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-WzmtjqJcv88D7m6Yr05O71jBD-M4JWYZjyae7Za3k0A.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-NnjJ5OWpKsdNuk2Ld-67vo_cD1pH88LP7VWec66XsHI.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-EDBol082EuOFAuWaKY2elxZ7I-VUUnBewcvkkanU0jI.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-uYc3aBAri46Mq3jFv3pdWklv9hJIZb_hsCZotZkYL9o.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-Xt_1gPHbN6uIt4-k243SBvr2OZpq3RehFUlvf0rm5ow.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-Fzg91lhGFbJ6E0U4E_u9yLfl6yi9mbVf_iBBUwCbarg.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-AXhXW72qkVbyD61fHmBMXKUw0o3K7lEhO6AP-DuJDq0.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test16902842258742888.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-c2L6HGhCxK-biiq2kTsHM6m4qgQiDCEWZ_D5bDSATXo.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-TZw13LVoN1ucLJk4syXlS9EgcxFaG533VGbVK9fp0vQ.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-_muCkS-Jib7CwiR5VYa4XjFCJHzFpII54h5LjRHiKI4.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-93jY2KALardNkSJ73BJkZwfe0cUth13Z5T3BOF8m8Sg.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-jw-DeHH9VV54lMiqv8zKsKOeFCSmDSuFFnmeoW4iWus.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-9PzumH4Y2O8u-sIL5Fc00XwspUokow8hDLNmmjp4VSA.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tWDW0udGrIR1lt26sY8WzSIcOX_h2U5VWw468k4vcgw.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-6sUa0yzBz9eGaqvk32Vc9qcvlnl07LUnCpSe2TRI9F0.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-K5P4wX1lvfpx4MrS4WQWQbWTqQzU5Me0HiJp4Wg92qM.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-YS4uAe-ITpJUu9dbLZPMKSyRLML5i-vbc_LGuvzPYwk.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-isrliBXaJ3pKL3jfYNvCWcrctDlCmz97ZKVxjNe8Fs0.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-WzmtjqJcv88D7m6Yr05O71jBD-M4JWYZjyae7Za3k0A.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-ubP0gYasf894FP0UefPGwdLIKoSPQHRsYbdzdVydeEQ.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-v0nD1GxNkuwMaul7zfOFFQLY6DiIFyuOVUJKpkYoogc.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-9Te6Ba37kwkR9N-PfLJLanaw0wv_x0WX72H1Camliog.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-49T_7ndM-2L1SHHXZVwbRcIazsDkAc_WcpEPpg2SfAo.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-zYJjQfPluVKcStVfVYFrYqWHvaKXdnZPiH5oOcfmSv4.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-YHpLznUqzX_1sMXjSfYBTf4cwh39doHwBqkFXHMUsXc.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-ixhEq0SbgbteSPwNWEfcNQ2KYLE12fgySFiIkX00xko.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-7UvdfzFTLpAJSvcYQKoKQLrl7nI8GgYDtCkfrTzqmlA.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-Ob3dT6aDW_tTy3iZeKlO2Lzhqmfk-HWubxJdzRUY90k.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-ZMbFYZB7WoLPD9elWtO_uQ9UFrsL6JZrLLKOtSa-7f4.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-izMzKCjukD5gwvXxKLFRevGv0l7SdDNXx_f4NomBWWI.jar
    Nov 02, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-GutqMnK3R6iGYwjue2jqXaoyycl9TsMWu-W64w8LQ9U.jar
    Nov 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Nov 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96044 bytes, hash ed56d243f9e93bd626b063ef7037946fe0e2e39dd5b073692083400c4d33dc2c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-7VbSQ_npO9YmsGPvcDeUb-Di453VsHNpIINADE0z3Cw.pb
    Nov 02, 2020 12:45:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Nov 02, 2020 12:45:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-01_16_45_25-7263691607296593136?project=apache-beam-testing
    Nov 02, 2020 12:45:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-01_16_45_25-7263691607296593136
    Nov 02, 2020 12:45:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-01_16_45_25-7263691607296593136
    Nov 02, 2020 12:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-02T00:45:25.034Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 02, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:45:32.222Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 02, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:45:32.791Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 02, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:45:32.831Z: Expanding GroupByKey operations into optimizable parts.
    Nov 02, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:45:32.873Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 02, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:45:32.957Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 02, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:45:32.985Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 02, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:45:33.017Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 02, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:45:33.043Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 02, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:45:33.402Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 02, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:45:33.483Z: Starting 5 workers in us-central1-f...
    Nov 02, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:45:55.933Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 02, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:45:57.956Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 02, 2020 12:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:46:16.645Z: Workers have started successfully.
    Nov 02, 2020 12:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:46:16.677Z: Workers have started successfully.
    Nov 02, 2020 12:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:46:49.180Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 02, 2020 12:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:46:49.342Z: Cleaning up.
    Nov 02, 2020 12:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:46:49.409Z: Stopping worker pool...
    Nov 02, 2020 12:47:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:47:34.392Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 02, 2020 12:47:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-02T00:47:34.436Z: Worker pool stopped.
    Nov 02, 2020 12:47:39 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-01_16_45_25-7263691607296593136 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 2c766a01-980a-409d-9866-787e76250291 and timestamp: 2020-11-02T00:47:39.472000000Z:
                     Metric:                    Value:
                   read_time                    14.116
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 02, 2020 12:47:39 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 31.547 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 23s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/susnkf4oo2qw6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1187

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1187/display/redirect>

Changes:


------------------------------------------
[...truncated 271.60 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 01, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 01, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 01, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 01, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 01, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 01, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 01, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1847488743]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
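
The IllegalStateException above is the coder-inference problem the message itself describes: the Row output of ParDo(RowMonitor) has no schema attached, so no RowCoder can be inferred. A minimal sketch of the remedy the error suggests (PCollection.setRowSchema), with illustrative field types; the real schema comes from the HACKER_NEWS table:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaFix {
      // Attach an explicit schema so the SDK can infer a coder for the Row PCollection.
      static PCollection<Row> attachSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addStringField("author") // illustrative fields and types
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();
        return rows.setRowSchema(schema);
      }
    }
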

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 01, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 01, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 01, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 01, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 01, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 01, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 01, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 01, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Nov 01, 2020 6:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 01, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 01, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-ehEvLdnGx4PvmMOfcH1kL7SEXXyvZ5fPHo6uMP9vvkE.jar
    Nov 01, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-NNc90X_R2jC03aA6o_sWbldPIG5Q9eJgVuXPSaiELyA.jar
    Nov 01, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-nxvu8gR6u4l7oAfUqF-ZyVij4QXn62x_RyEk3X2u1HU.jar
    Nov 01, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-6pRKPtxXPsYhVOExwOPWyo0TcvgIxf5EKfJYbaWAmYg.jar
    Nov 01, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-C3EcWdlT7Y6fPSZoThVmTmgGcA0RzrxUvVWflZM6bM4.jar
    Nov 01, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-L50KLZCXGfywBgOM5KcjFE06_11rlqMQ9HD6TVSkNXM.jar
    Nov 01, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-pLCsy_kn6SyFecZqKNeZoVnvEh9dWnX4MTFtKjoR6iU.jar
    Nov 01, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3752482386822794104.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-MhsHRYf_jHLekj3vboWal4kAOGyBhjUPcEBdb6_mh9E.jar
    Nov 01, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-Px_krnoVgYzmAfY9udbQN3X4iKhDrk8rdQYgbdQA6jQ.jar
    Nov 01, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-4x8SpL7UHqVcgn0VBNBk_3GlnDDEL8G9C7asldaoJ58.jar
    Nov 01, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-vfOyf_uKywL1FAN2z3lcILKzHI2DzEEslCHM715tII8.jar
    Nov 01, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-LTufyTkNPDCYrfRfNi0Z58Hq6GtlvhQZ-TcqOQf_IV0.jar
    Nov 01, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-3mHMKy6IuLpPupB_qNIfJXMmm72pc4lOyBXIm0dSB0U.jar
    Nov 01, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-2Q_mb-SYkHck3oNrNazrDp2Dh95xTSnRS0QxV1MUB_A.jar
    Nov 01, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-oAUEieQpyNbXRWcXPyNqOeKkyItTiuwKjrstR2b0zFk.jar
    Nov 01, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-jqL5sSw0tA9NIq-XkpoX2tPozoC242S8Y7hXUTgMnF0.jar
    Nov 01, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-ehEvLdnGx4PvmMOfcH1kL7SEXXyvZ5fPHo6uMP9vvkE.jar
    Nov 01, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-x0SKl8kaCVNW9fIq6zJsCMNRLee2jz7tq3YlFfzNaa8.jar
    Nov 01, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-ic3vieHoljt2JCK-Sg3fSXR-rrK4NJ3ZxibPtuRqnyg.jar
    Nov 01, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-Is_PeqACMQqULQkBaEtwQJAuqX9wGZ9LoijrVOqerRU.jar
    Nov 01, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-_-nAR47kjPhRXNUO-UvfpVVsnOztI1_KBGJJ4asadtk.jar
    Nov 01, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-qbjA_8v6lzh6pGMGyYzdxc77XRlhXDY_UN0d7e1PSmE.jar
    Nov 01, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-lD1NLxBqbd-jcIYLTnb1AXR6KCOjomkPIWuUhylw63E.jar
    Nov 01, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-2khzIbx6Z__Lhf7TNmQts2gJxyGMKFvVkNlKklFgrwo.jar
    Nov 01, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-szReEuPwk50OkcnI4XBAFMNcDmvciYOLT9do25YET8U.jar
    Nov 01, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-hEfq047JZUxL51sxahxoaweyDFcj8csklW3sa3i3Hjs.jar
    Nov 01, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-LHgHUxYv1skiiWJNNic3Yv8CksRJ1NoMY1ahs37W42o.jar
    Nov 01, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-h9vYjMjCH4mJXs-RAN2ObPYBHFzeQ3Vqsj1wVroq7lc.jar
    Nov 01, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-yHvyvqM3Xn-TBB8yumtHrDGgLW4Z7RhADM58-sOxKD0.jar
    Nov 01, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-eZmXuMKX_D5hUoJX_wq6vDBT6x7lgSf0iRhHAMgCXSU.jar
    Nov 01, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-GLNx2LMYNC4zdgjzJ0SACQtaikohRgIZf28P4BwDknw.jar
    Nov 01, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 0 seconds
    Nov 01, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 01, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 01, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 01, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 01, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 01, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96046 bytes, hash 627810162276d958f94fe16a83d1f82e3d48845b04fa499bfbdd5d57f7b3d53d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-YngQFiJ22Vj5T-Fqg9H4Lj1IhFsE-kmb-91dV_ez1T0.pb
    Nov 01, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Nov 01, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-01_10_45_17-7716126348544714422?project=apache-beam-testing
    Nov 01, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-01_10_45_17-7716126348544714422
    Nov 01, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-01_10_45_17-7716126348544714422
    Nov 01, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-01T18:45:17.184Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 01, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:45:26.289Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 01, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:45:26.821Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 01, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:45:26.860Z: Expanding GroupByKey operations into optimizable parts.
    Nov 01, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:45:26.886Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 01, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:45:26.952Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 01, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:45:26.987Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 01, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:45:27.038Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 01, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:45:27.069Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 01, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:45:27.440Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 01, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:45:27.508Z: Starting 5 workers in us-central1-f...
    Nov 01, 2020 6:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:45:47.829Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 01, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:45:51.011Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Nov 01, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:45:51.057Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Nov 01, 2020 6:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:46:01.572Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 01, 2020 6:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:46:10.471Z: Workers have started successfully.
    Nov 01, 2020 6:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:46:10.500Z: Workers have started successfully.
    Nov 01, 2020 6:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:46:48.188Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 01, 2020 6:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:46:48.340Z: Cleaning up.
    Nov 01, 2020 6:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:46:48.416Z: Stopping worker pool...
    Nov 01, 2020 6:47:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:47:49.930Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 01, 2020 6:47:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T18:47:49.976Z: Worker pool stopped.
    Nov 01, 2020 6:47:55 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-01_10_45_17-7716126348544714422 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0eed84cb-045c-4234-8abc-11bc24e8a467 and timestamp: 2020-11-01T18:47:55.083000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     18.55

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 01, 2020 6:47:55 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 52.17 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 40s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/frnnz2nm2fzes

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1186

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1186/display/redirect>

Changes:


------------------------------------------
[...truncated 271.14 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 01, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 01, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 01, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 01, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 01, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 01, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 01, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@103815614]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 01, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 01, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 01, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 01, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 01, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 01, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 01, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 01, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Nov 01, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-Hxy6uWhsX_Z0cyuzXfcVpH6uBPNI_lEc8Ozds3qsWIU.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-R8gniys2Axbhgfpl-mDWCcXzjbqiAsDrEt9OpD7-duM.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-OxZTy8zNOHPwQBGZQx_7bZ94BLLzPGpRJL5cLF3t5jA.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-TmJhNhXR3XqKgeHqPMdycNFqyliqg4kwsGnbgawbkmY.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-OE4BExDdzPRB6EBPnMtGXj8WpPp4UkWR6VVQvalLWBI.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-caQoAndzjPTbeV8KkO8RRme8TJzig_PDP23KmmMopxE.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-mXF9rL4iPB5UBAP2nLE7OdVjEFcHNgOYJ5e2St4lbbQ.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-P1NA4DxOtO0v2fDXSUHzQTYtglhyylG290zFxOrD3bM.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test9117465170216586939.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-gR6FSTw9SRWACMFFSWGRB4prw7S3PQ9tF4NrX0GeAW0.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-GfSL1Ujgh7Iixqk-6FfvM8DoZKtGsjNavyUNvBpWiL8.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-BySWhLSQ5eVLmdEBIrfeDPsGOpchlmKtH49JTsAh2-E.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-gL_TocjDvtQxQpor5-woUrkm9AoUaIVwkgVx9FsdWCE.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-7KmeI8a8OByAyF_ky5kH6Zk24eFSjUeviYj9a6Bwee8.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-l3tz-THSzmtdVMC83qOUOLpg_SKB9cukA2HDu7HRIP0.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-fEYYukrfOpOkUM64XUIYGIlZub_LmYIF47KXeYxXRr4.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-uT9ZGvka_TNlgZt9mkHjxgKTb1iRZXtvZj7hhclLRaA.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-Hxy6uWhsX_Z0cyuzXfcVpH6uBPNI_lEc8Ozds3qsWIU.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-pYGddV37cUZ-m3ZW3IsoPmbDLC-rWm2rfWJZH1eitY4.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-SAWXxu_jv_xRQkm1wIz7lv0NtT_NOPH0v6TW_Cdimkc.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-opFLHyeQH1KnRTzzgn7mzhEjllOYPrC-FYlOK-Z8iOA.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-tanz6_zjNNm_iBHBQUsPy4rNafK-TjRFF_5ldQ5283g.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-9NZrYMCw8CPK3wSAvfv4JztzWPzi4Ba2ARjFp4uP_9s.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-8FtkHVMtCZ5Z-SmEEju7Pv6a6KMy00A-30tAA9Xb7JI.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-fmUxUX0dpACA_xpMVSxPcVBOpJtkBHRhHa352RTFpoY.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-n8rD3u-Cm0WJyUACh9160rrOsnughxK8dHb8yPXKBzc.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-MZtbeQkeDVETYTuqO129oRfuZogby7OsK8NXOkI2qjg.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-Ysz86EosEtFQyKTX_zYODuXS4o9cdsR8cAq4DSl6oZc.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-a-RpTWPQLJSqW7f5vX2Dk9h4E8fO0R1hY499jEq-p5c.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-gTTDoaZzQ9_B6HnczNU5nSxyvjQrOs5VRI1KdHGPvhM.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-qJMONWsdld3QmNByoRtHhBpn_q_Cvi9rPw_pbHmo7tM.jar
    Nov 01, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-82K26ifHEvXym_bs54QkzIIBEzFARWaRIaVlkdK-m_k.jar
    Nov 01, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 0 seconds
    Nov 01, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 01, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 01, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 01, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 01, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 01, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96042 bytes, hash 48cd5dad0737e66bb3ae68eeed1326d6272ac82b92037aa960305ed8c4dc3f9c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-SM1drQc35muzrmju7RMm1icqyCuSA3qpYDBe2MTcP5w.pb
    Nov 01, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Nov 01, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-01_04_45_18-4541770337140008280?project=apache-beam-testing
    Nov 01, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-11-01_04_45_18-4541770337140008280
    Nov 01, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-11-01_04_45_18-4541770337140008280
    Nov 01, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-01T12:45:18.745Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 01, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:45:29.145Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 01, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:45:29.703Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 01, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:45:29.744Z: Expanding GroupByKey operations into optimizable parts.
    Nov 01, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:45:29.770Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 01, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:45:29.838Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 01, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:45:29.868Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 01, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:45:29.901Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 01, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:45:29.936Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 01, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:45:30.392Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 01, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:45:30.460Z: Starting 5 workers in us-central1-f...
    Nov 01, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:45:48.502Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 01, 2020 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:46:03.317Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Nov 01, 2020 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:46:03.345Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Nov 01, 2020 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:46:08.581Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 01, 2020 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:46:29.456Z: Workers have started successfully.
    Nov 01, 2020 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:46:29.486Z: Workers have started successfully.
    Nov 01, 2020 12:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:47:03.611Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 01, 2020 12:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:47:03.769Z: Cleaning up.
    Nov 01, 2020 12:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:47:03.895Z: Stopping worker pool...
    Nov 01, 2020 12:47:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:47:57.995Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 01, 2020 12:47:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T12:47:58.043Z: Worker pool stopped.
    Nov 01, 2020 12:48:03 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-11-01_04_45_18-4541770337140008280 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ede700c2-8521-4df7-a533-f93f18a691e7 and timestamp: 2020-11-01T12:48:04.108000000Z:
                     Metric:                    Value:
                   read_time                    14.107
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 01, 2020 12:48:04 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
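
    This warning means the run finished but its read_time and fields_read results were dropped: the InfluxDB publisher in beam-sdks-java-test-utils was given no target database or measurement, so it skipped publishing. Below is a minimal sketch of supplying that configuration; the InfluxDBSettings builder shown here, the option names in the comment, and all values are assumptions about the test-utils module, not settings verified against this Jenkins job.

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    public class InfluxSettingsSketch {
      public static void main(String[] args) {
        // Hypothetical values; a real perf-test run would normally receive these
        // through pipeline options (e.g. --influxHost / --influxDatabase /
        // --influxMeasurement -- assumed names, not taken from this job's config).
        InfluxDBSettings settings =
            InfluxDBSettings.builder()
                .withHost("http://localhost:8086")
                .withDatabase("beam_test_metrics")
                .withMeasurement("sql_bqio_read_java_batch")
                .get();

        // With a database and measurement present, the publisher's check passes and
        // the collected metrics can be written out instead of being dropped.
        System.out.println("InfluxDB settings constructed: " + settings);
      }
    }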

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 59.309 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 47s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/ufwwrr7olf5wi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1185

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1185/display/redirect>

Changes:


------------------------------------------
[...truncated 272.77 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@103815614]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Nov 01, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-BLxswk2yjbq1yvQxRTS_1_AIs4PJXlMv2kPxt0FXUtA.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-UtEW9lrdHzLVVmXMsnB8j8gqJtx6U_cz77A7r2YYHsY.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-y2NZjUNqSqCObBLnUfEAMSR4Rgl4-BThQdjIPJkkxqQ.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-kyK0F9_4lEKcISXXBLHOpGj7JwqSPOieEO0mRoS2eXk.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2187443951758089137.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-g1c6TYN1N8Od8_aTqcfCcs2lk4ehiypWliz6G6lZsK4.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-DHiH4I4N_QcruI10-rcDW2IvmRYg-IfhF7sEyYqWA5w.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-dzfXBiDWp53UKtfvIn3JjUbGrYvaxc8wUw_Ptp3XzbE.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-CvxinApUmtu-98xM3dnXM2xUnPNjaxxKtAsjMlHgI9k.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-pf3Q6wAQ43cla0czZG8-QpltVETI-Zk4gujbb5flrB8.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-zlfl-FeYSqoBjoAu6l5Jj5ByY_i7fxZWY_gyduHWeyE.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-DNUJCzTfMFjn6Qy2hLNell7wG0ApFrbDCRfm10OUfZQ.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-maymthgx_xrYO4rOXsMDAXVH0o9VNVtP1IzOt_C6SpM.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-TylP19opY_rassm_RbF3cmIhxrDlFb2NNYUPo7hgS9c.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-ooB57O0OR0BtAGHp2nwEK2iJoRNWoOSlkzUl1T7Y_00.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-7jWvOnzEX9CZ-amEk7IRgIcjWLl52WBycw3IysE7IJs.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-yFCIENvCF6mLJDYTtuM723w9AJopery5ai-FoUoACiI.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-BLxswk2yjbq1yvQxRTS_1_AIs4PJXlMv2kPxt0FXUtA.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-pDnK4XcVv_XFduiE4kUHS4CtMKglpYRF5G8KGRjR_68.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-k5cqLSU11BhzG5IJ599Axepnpj_53p4mv7XlfwHtTSE.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-TeDQZdh2IfPfkneIJnVwMOMr8hnj9sp5lAtvAHONR_s.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-tukP-dE3l8q_gZGa2PwqV0rgqFfoAI182hyHK-nvP1Q.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-7H3opnLrDBNZPdhidmhnGrAuVP48A_WzvWejQ9HvT9A.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-IZMIKyOrOZazrspMScrD_FzDSR0vjhmwFDclGEynx7M.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-Xsclb88Dk6LrR2k6unuhQut_2eC9KVfYwJQ9dN3epfg.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-AnVLtQqrM8WaKKnEw6Aj0PH6EtgrhIzceHLgehxJqtM.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-OIp1CAtIub9olsb-vpYwnyXstFekNEBQdrAKgnEampQ.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-ZRIOYz80OvN8v8BRDsYWkptfmtc9sbEM-uhoFEseJOY.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-tNZ7TQ9aMy64QDaQdygRazWp2xEWk5S5yHWW6xvbmYs.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-hp-VqGjGaag4PRKnPXDL9Mxuwp4iU1VlylUU6FWVHzo.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-k4o9T4zfT4bhLdvFmhCovHZtPiNIpver6Dfp6P8d8Vs.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-VUp9KEE6-goDlkvdMISCX3Fi9t3wvibkLaLqmCqNrlI.jar
    Nov 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 0 seconds
    Nov 01, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 01, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 01, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 01, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 01, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 01, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96042 bytes, hash 5057f2bffbd53b3cec2beed43cb8fe4db0e967d1df94dd3ef0118bf64ff014eb> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-UFfyv_vVOzzsK-7UPLj-TbDpZ9HflN0-8BGL9k_wFOs.pb
    Nov 01, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Nov 01, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-31_23_45_17-12492350751371901538?project=apache-beam-testing
    Nov 01, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-31_23_45_17-12492350751371901538
    Nov 01, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-31_23_45_17-12492350751371901538
    Nov 01, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-01T06:45:17.508Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 01, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:45:25.110Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 01, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:45:25.674Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 01, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:45:25.706Z: Expanding GroupByKey operations into optimizable parts.
    Nov 01, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:45:25.739Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 01, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:45:25.819Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 01, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:45:25.847Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 01, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:45:25.880Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 01, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:45:25.917Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 01, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:45:26.439Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 01, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:45:26.523Z: Starting 5 workers in us-central1-f...
    Nov 01, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:45:32.643Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 01, 2020 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:45:51.438Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 01, 2020 6:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:46:09.822Z: Workers have started successfully.
    Nov 01, 2020 6:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:46:09.854Z: Workers have started successfully.
    Nov 01, 2020 6:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:46:43.823Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 01, 2020 6:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:46:43.962Z: Cleaning up.
    Nov 01, 2020 6:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:46:44.033Z: Stopping worker pool...
    Nov 01, 2020 6:47:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:47:26.677Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 01, 2020 6:47:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T06:47:26.715Z: Worker pool stopped.
    Nov 01, 2020 6:47:32 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-31_23_45_17-12492350751371901538 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 845ae05f-9297-4bf2-8adf-fbcd7d30b7ba and timestamp: 2020-11-01T06:47:32.562000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.568

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 01, 2020 6:47:33 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 29.167 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 16s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/62woh6h6belwu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1184

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1184/display/redirect>

Changes:


------------------------------------------
[...truncated 270.83 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Nov 01, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 01, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 01, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 01, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Nov 01, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 01, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 01, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1847488743]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
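(The exception above names its own remedy: give the Row output a schema via PCollection.setRowSchema, or set a Coder explicitly. Below is a minimal, self-contained sketch of that pattern; the schema, values, and class name are hypothetical and only illustrate the API, they are not the integration test's actual code.)

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Hypothetical schema for the four projected HACKER_NEWS columns.
        final Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(Create.of("seed"))
                .apply(
                    "EmitRow",
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void process(ProcessContext c) {
                            c.output(
                                Row.withSchema(schema)
                                    .addValues("someone", "story", "a title", 3L)
                                    .build());
                          }
                        }))
                // Without this call (or an explicit setCoder), getCoder() on the
                // Row output fails with the same IllegalStateException as above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }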

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 01, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 01, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 01, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Nov 01, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Nov 01, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Nov 01, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Nov 01, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Nov 01, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
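(The projection and filter pushed down above correspond to the BigQuery Storage Read API options exposed on BigQueryIO. A minimal, hypothetical sketch of an equivalent direct read follows; the table name and option values are assumptions for illustration, not the integration test's code.)

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Hypothetical fully-qualified table; the IT reads a Hacker News dataset.
        String table = "bigquery-public-data:hacker_news.full";

        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from(table)
                    .withMethod(TypedRead.Method.DIRECT_READ)
                    // Mirrors usedFields=[by, type, title, score] from the plan above.
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // Mirrors the filter shown in the push-down log line above.
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }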
    Nov 01, 2020 12:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Nov 01, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Nov 01, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-NeJmYrMBPicFhtEZcozxncgRrZsumFt0VFnSw6HTOZc.jar
    Nov 01, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-2E2015f5PNh7RfIpb3m3Yj-dsB_4sQqEQQjgVUsbtHs.jar
    Nov 01, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-IR7EkpDK4bACApwkL87LUVhjHzQ7CJOjKXOanS1k5FM.jar
    Nov 01, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-TwrPyy1nM_O_di-ZniXQK9I-EodJOxiYN7uYV-TAw8s.jar
    Nov 01, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-7CxSg3KPocSYCz6fKJsooJLSSus-GgU_yey7QMQPZwE.jar
    Nov 01, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-skHwpJHSGOf6_tG_j0RJzxSUArvF6UejE6BlmTesEfY.jar
    Nov 01, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-YatOH26QwLY3tDBtGUnxfbu0sdP1SWAEqUCMtnD_FTE.jar
    Nov 01, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test678778599758374413.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-1d2awPT1rdXGnQ5IPMnG-DBuNoHX1ddp9i48kUyJfaI.jar
    Nov 01, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-pzzHfAMA7tf0n9pqpSFZTbD90QQvsQktmdHTSheTzn0.jar
    Nov 01, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-uRGQgKbx7HM-ImglwyPDE3zSWhVOXrW4xL48kZtDAuM.jar
    Nov 01, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-KE6YQPR8HJK_rrx96HKVhT33hLpac0wevfjdAPuE7aE.jar
    Nov 01, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-pOtVMpR7CGE9QCfBX6LlO0avr8RzqgzQzwYtF-aPVv4.jar
    Nov 01, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-68BgiqsKmsuH2K0DdlRPrP-geGnaptdSaNOkt6bUiW0.jar
    Nov 01, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-yTamSGHAJMhOzblH-la-lbR5Ig-N8ObTv5KlUgLYblo.jar
    Nov 01, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-qTDic_bC4wBJ24-0jUpjxTremUSyMGbG9l2Ohy-SveI.jar
    Nov 01, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-w1AMn_aYtKONWs6nlO4Q_L0GRtanYbKfkhucS7YrVLc.jar
    Nov 01, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-2E2015f5PNh7RfIpb3m3Yj-dsB_4sQqEQQjgVUsbtHs.jar
    Nov 01, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT--OYsr69vSVpDUzfwz39erBKM3si5qT5i-pSiXCOBeDU.jar
    Nov 01, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-TGoBAMo9LR25YVHX_ANnseA162w1OqcUWie8iW0dwpI.jar
    Nov 01, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-RqhIqT0mKM6CWYw_KEt-1tXJ31K4dOwYb09oZk8gwUk.jar
    Nov 01, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-90jreZNzokQLGGS9zm15OVgQ6I6AffsqHdg3y8pGQww.jar
    Nov 01, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-xkLeKDLPgcXxSd2njaPYTtdbJncBk0BMiqSwcxewZuI.jar
    Nov 01, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-zM_o97vy5HYJCvCQQ-ZejsoJJH2jgs_5id86Pd7CLA4.jar
    Nov 01, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-FQfLEQ0lRMRpCeBRNXNeN7vy8iML8s_-nH6CEfXAGBk.jar
    Nov 01, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-hcadbmZApVLaqzMr9KU5Nq3mcVh4IVlCLknCaNgXfRE.jar
    Nov 01, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-Qv48yOjn8zlF5iI8ECIJk0mszOp2jDWe7h5Y2RG6D9o.jar
    Nov 01, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-L2FzWcyJxjerbSYP_WWqdXKHmsNrsSARg6WNVIyngj8.jar
    Nov 01, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-ojuIYa-QFROwNljf67o8i15he4hElNd00l_eUTCgMWg.jar
    Nov 01, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-c5hBQoUHs7Vj4ik0THYvRqKB5Ql_m6IEaFIQxuN82e4.jar
    Nov 01, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-qS9DvS-IPDgJZ0YCys1-ho66MLnQpv1nnFHKnSGw1-U.jar
    Nov 01, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-U-tfKd9Zaq5Apm-JRb5LMBpgbxQeZkDkVMv05RG_rkc.jar
    Nov 01, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 0 seconds
    Nov 01, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Nov 01, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Nov 01, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Nov 01, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Nov 01, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Nov 01, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96044 bytes, hash 767e194562aa902273a332c4fbee844263016e0581edbc9eef005ec1e53d182c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-dn4ZRWKqkCJzozLE--6EQmMBbgWB7bye7wBeweU9GCw.pb
    Nov 01, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Nov 01, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-31_17_45_19-11866490078306036238?project=apache-beam-testing
    Nov 01, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-31_17_45_19-11866490078306036238
    Nov 01, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-31_17_45_19-11866490078306036238
    Nov 01, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-11-01T00:45:19.373Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Nov 01, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:45:27.104Z: Worker configuration: n1-standard-1 in us-central1-f.
    Nov 01, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:45:27.640Z: Expanding CoGroupByKey operations into optimizable parts.
    Nov 01, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:45:27.676Z: Expanding GroupByKey operations into optimizable parts.
    Nov 01, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:45:27.704Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Nov 01, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:45:27.770Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Nov 01, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:45:27.804Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Nov 01, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:45:27.838Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Nov 01, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:45:27.872Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Nov 01, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:45:28.241Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 01, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:45:28.321Z: Starting 5 workers in us-central1-f...
    Nov 01, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:45:44.952Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Nov 01, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:45:57.280Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Nov 01, 2020 12:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:46:18.607Z: Workers have started successfully.
    Nov 01, 2020 12:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:46:18.647Z: Workers have started successfully.
    Nov 01, 2020 12:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:46:52.789Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Nov 01, 2020 12:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:46:52.982Z: Cleaning up.
    Nov 01, 2020 12:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:46:53.060Z: Stopping worker pool...
    Nov 01, 2020 12:47:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:47:40.323Z: Autoscaling: Resized worker pool from 5 to 0.
    Nov 01, 2020 12:47:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-11-01T00:47:40.374Z: Worker pool stopped.
    Nov 01, 2020 12:47:46 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-31_17_45_19-11866490078306036238 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 52092285-4a39-4562-b45c-bde88986cab7 and timestamp: 2020-11-01T00:47:46.825000000Z:
                     Metric:                    Value:
                   read_time                     14.63
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Nov 01, 2020 12:47:47 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 41.55 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 30s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/5n5ut7wcla3ou

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1183

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1183/display/redirect>

Changes:


------------------------------------------
[...truncated 270.65 KB...]
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 31, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 31, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 31, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 31, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 31, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 31, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 31, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@103815614]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 31, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 31, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 31, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 31, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 31, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 31, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 31, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 31, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 31, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-FXYOrdXPCh_C2xx0IZ3jcsbxGw3ZRofIXwqmJhdIiTg.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-mVTweo857X0X0ks6raDTnE_I3uM2261D51y36VZtgjw.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-JqYBUq9S0asozd1KQb5cZjK1kvomFPv1vMdv9vpqoM4.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-KlGKgYmpkAZF0Sm-uwG2VMpsjxHOZU8JTOPfT4tUR58.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-s5RwndJ2GQRUN8MckNAa8r5ZRJJ8s0rMk1KPkUb3GRo.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-gPckBzTabC0A5KFp8jIzLvsgFFY42PUuH6zGTuigTaI.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-bsmbDCvXxZa0FXogc7Ut6HByV2cf1cYktyWaW-WDwjI.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-MhApJTubltXV7d2uFpLKuSdJoQC1dG6VTAQrDFIzhoQ.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-GpnHIiulywxjHYrWHZoqqjOXFOvHcbf_WBLc0wRsnrM.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-u6y3M30s34yj-izylURrMv17JIOJSzRIuQwh4JRbLP4.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-Q1hJW9ng3VKnhFtEv8eRS34p8rBDjueM6WJER4HBa-U.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-zkG3W91E7dAWyvXwX61rHjegUV9hM6mfWnxQ1cAoXew.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-InzjXZKAfn29Ud1yIhpVbJKqskUttRVQLZ2cBzlh0tU.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-9f-2cTYJgLhFCnieqNxWOZERwWh7xkj2t6HazXettcA.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5840828239117439726.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-c9m_6YFGtzjsQGxtNqeof-Eova15selVz70ueVG3LAc.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-b-rei-V5M7D4tx9IANaFZJNCHdmGo_ypfG57Tp223iI.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-FXYOrdXPCh_C2xx0IZ3jcsbxGw3ZRofIXwqmJhdIiTg.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-DMTekNmCR0dggdYDT7DfBkJNgsEKCWlPyfQZcUz9znc.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-KQ4deDzCKb5DDBW-csak0mHJqNqIi9ZWtfr6648azTU.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-xoZNKgxb2fpLJUq-uspg5wjxXk237X1u6nPvvymjXug.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-cWK08OnSc33TxBgfOuoqvkp9k-Jn0TrotUaMeB12s9c.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-H5kLjyhTBYnjjsPjsrikrWYMSgc1SqniYU1JBGZKIPY.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-BJWcDcY4QLiAS53JWBk3fE9uGqrifziNzMwjzSNW1LQ.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-vF9s0Ymw7Nu7Ng9YFNkCUXez9RzB_7fYZCMEZ_44bQs.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-KJoP_5EwsZh-i7-uuHp-OxN7P63DABCfPGFC4gjDz7w.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-xowhxq8ThoLiDuB5JjanbkKHXqjxGl53B62yfGRnQpg.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-XBWApMzOavZNSTeANj2CekcKF705X-ALCo53fTfkeEQ.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-TP_CL2Z074ygB59yAVoAgKqS_K2c1UDnU3l1SIRew8c.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-FPI-uTRM3bxvt6FirVGMDJbbCwiRD_nimbgOKOrBZgs.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-WBBqhC4wzoblf2T574t0MJSMGMzdYM5UO9ihq27_ZV8.jar
    Oct 31, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-h8aCNtMM3FR6tmfE81MZ5Y2fvvR2GaFme29ECzriuWE.jar
    Oct 31, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 31, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 31, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 31, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 31, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 31, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 31, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96042 bytes, hash 43a2ca430b6b5b895c0492020fcba6aa52f9adab5f79877bbb321fd054ec9eef> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Q6LKQwtrW4lcBJICD8umqlL5ratfeYd7uzIf0FTsnu8.pb
    Oct 31, 2020 6:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 31, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-31_11_45_27-12141602894371344096?project=apache-beam-testing
    Oct 31, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-31_11_45_27-12141602894371344096
    Oct 31, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-31_11_45_27-12141602894371344096
    Oct 31, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-31T18:45:27.223Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 31, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:45:36.108Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 31, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:45:36.768Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 31, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:45:36.803Z: Expanding GroupByKey operations into optimizable parts.
    Oct 31, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:45:36.834Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 31, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:45:36.909Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 31, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:45:36.938Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 31, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:45:36.969Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 31, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:45:37.001Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 31, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:45:37.447Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 31, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:45:37.530Z: Starting 5 workers in us-central1-f...
    Oct 31, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:46:05.137Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 31, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:46:05.305Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 31, 2020 6:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:46:24.752Z: Workers have started successfully.
    Oct 31, 2020 6:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:46:24.788Z: Workers have started successfully.
    Oct 31, 2020 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:47:01.050Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 31, 2020 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:47:01.218Z: Cleaning up.
    Oct 31, 2020 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:47:01.293Z: Stopping worker pool...
    Oct 31, 2020 6:47:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:47:51.855Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 31, 2020 6:47:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T18:47:51.903Z: Worker pool stopped.
    Oct 31, 2020 6:47:57 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-31_11_45_27-12141602894371344096 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d3cdcee2-f408-47f2-bea6-6eccfc1f2399 and timestamp: 2020-10-31T18:47:57.623000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    16.073

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 31, 2020 6:47:58 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 2 mins 46.37 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 40s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/pf2e6ra7sgxvu

Build cache (/home/jenkins/.gradle/caches/build-cache-1) removing files not accessed on or after Sat Oct 24 18:44:24 UTC 2020.
Build cache (/home/jenkins/.gradle/caches/build-cache-1) cleaned up in 0.115 secs.
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1182

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1182/display/redirect>

Changes:


------------------------------------------
[...truncated 269.76 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 31, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 31, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 31, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 31, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 31, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 31, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 31, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@103815614]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
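
The IllegalStateException above names its own remedy: a ParDo that emits Beam Row values needs an explicit schema on its output PCollection before a Coder can be inferred. The following is a minimal, self-contained sketch of that pattern, not the test's actual code; the schema, field names, sample values, and class name are assumptions for illustration only.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {

      // Assumed schema for illustration; it mirrors only two of the columns the
      // test query selects and is not the real HACKER_NEWS schema.
      private static final Schema SCHEMA =
          Schema.builder().addStringField("type").addInt64Field("score").build();

      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        PCollection<Row> rows =
            p.apply(Create.of("story,3", "job,5"))
                .apply(
                    "ToRow",
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void process(@Element String line, OutputReceiver<Row> out) {
                            String[] parts = line.split(",");
                            out.output(
                                Row.withSchema(SCHEMA)
                                    .addValues(parts[0], Long.parseLong(parts[1]))
                                    .build());
                          }
                        }))
                // Without this call Beam cannot infer a Coder for Row and pipeline
                // construction fails with the IllegalStateException shown above.
                .setRowSchema(SCHEMA);

        p.run().waitUntilFinish();
      }
    }

Per the error text, the missing schema in the failing test would belong on the output of ParDo(RowMonitor); the sketch only shows the general shape of that call.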

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 31, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 31, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 31, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 31, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 31, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 31, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 31, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 31, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
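
The usedFields projection and the filter logged here are what the SQL layer pushes into the BigQuery Storage Read API. As a point of reference only, the sketch below shows an equivalent hand-written BigQueryIO direct read with the same column selection and row restriction; the table spec placeholder and class name are assumptions, and the restriction string is copied verbatim from the log line above.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.TypeDescriptors;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply(
                "Read with push-down",
                BigQueryIO.readTableRows()
                    .from("<project>:<dataset>.<hacker_news_table>") // placeholder table spec
                    .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                    // Corresponds to usedFields=[by, type, title, score] in the plan above.
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // Corresponds to the filter pushed down in the log line above.
                    .withRowRestriction(
                        "(`type` = 'story' OR `type` = 'job') AND `score` > 2"))
            .apply(
                "ToTitle",
                MapElements.into(TypeDescriptors.strings())
                    .via((TableRow row) -> (String) row.get("title")));

        p.run().waitUntilFinish();
      }
    }
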
    Oct 31, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-9TL00PwgzEZsapo54qOQshFOX8qf_d05U8TUtRYrnoA.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-QV2pEfE7uO2pmOmJWoowuaw2kdVA3-4AF0WAy5QtB4E.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-SEaBLKUP6qbSNb6TbPRV4X3lXDYNnn6cMyQiJ4XQEoc.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-_DWyZxUODuslLrl7YIVK9BKk6WiUzUujGclTubMVhLM.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-vdwhMdCtA262LDmNfcNq4PpB4W9fgVkS2d-_FqsYGI8.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-N_pAX1mwRHcF6hR3Hfxpw2lqnL4bNRCI4JYo0qpOT-0.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-XXrb5SmoRLvvHWpqF1tSwFVb70wiMGppOB0y7_1HhX4.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-6QvO7kQAshd9PbiVStDV1dyGvYwb87QwUGtksqezp4k.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7658871777980938029.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-9Vvt97luCFeR79wTYqClJsMk46VzBPNXjC_fPkP8fKA.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-ublbHbRRSGdTtYd4ywZRK09XW30VnohA4P3UTc4TcNY.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-L69ruK0-nz8ToJS7QUK8ujyjl_rkwfsWe9fZKSyRmOo.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-wyqqW4wuUUL1L4krhqAAWOqbNS8_2NNz7uzBIY7BeHI.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-k8dCr6Y2_VYFzGRgZqk0hWls2sO4xZMQDvkF7gGWDH8.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-ZxVCkwEppeit4oRJc8lRKf_5KThnLvTbLQxYgTu4Y5o.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-1chsYr4oYPWwlhXxaEi-8KuxOTNMkJQlxBpHcf4DXPE.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-znOF5J8en8VSO1JPBjf7U4nySQ_U-O7mEOv3ZoRashs.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-9TL00PwgzEZsapo54qOQshFOX8qf_d05U8TUtRYrnoA.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-0VOAeCbg5bC0gaETfGvA6HyddrWmorMayGKlxE6D-Mk.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-rOyz46iKYwDXJWnLRwFqSIT8X86zbJi8I5K3QLSE2wQ.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-iDkWYC7YAJ9L8mvWBRV2Uwt89n40K9NUyikgSVypNWY.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-MZ1pe8GIUHHDCGU23BhQXxXmBbFHaTrq3dOz65dqusU.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-_enuxEws1M0Is2y97oYV-WIZPrfetro8WxjMq6UURsY.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-wJ7JqBNPfaE9GD3gRiTxixSoszi4xRHx3sdbg2wCU5w.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-mbxVj9tdXeFUUuN37O4rEZWa7M5GEge6sVhLBZC1tnY.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-HuDQ4TaNX_AqvZov1Ma3KG3oTPN6eUAKPEOWDF4IIFk.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-mE3VcdCzjBf8wVGhbqa1GsIKAa1Mm8eXeOGVzWdtwPw.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-EtSkG1rPOD_OOnCK_Jkb_PtklcvlXUowN9vJAC9Zybk.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-E8tpgTLtKsvC-bNthkIRX-rWjGkEyGiuBFlKCddI5PI.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-t6aGJtE8a7zuMOrdQIopnkg0LCktzseMqLg8BUW0YOg.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-GDGmYDtrWSK6j8b9UU2HOBmgV1uqniTttN3IvYMju2A.jar
    Oct 31, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-0IM9ZrVUK6SeucX7c7Olmje1KmV3UKnCfu0m2PzIWXY.jar
    Oct 31, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 31, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 31, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 31, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 31, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 31, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 31, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96046 bytes, hash ed803c5e341c787e72a77e03276e5fe1fae6cc9b172e85e3832d1ff0fd41f20c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-7YA8XjQceH5yp34DJ25f4frmzJsXLoXjgy0f8P1B8gw.pb
    Oct 31, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 31, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-31_05_45_21-6703902326119062906?project=apache-beam-testing
    Oct 31, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-31_05_45_21-6703902326119062906
    Oct 31, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-31_05_45_21-6703902326119062906
    Oct 31, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-31T12:45:21.794Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 31, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:45:29.244Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 31, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:45:29.817Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 31, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:45:29.858Z: Expanding GroupByKey operations into optimizable parts.
    Oct 31, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:45:29.890Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 31, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:45:29.978Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 31, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:45:30.005Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 31, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:45:30.036Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 31, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:45:30.071Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 31, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:45:30.570Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 31, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:45:30.649Z: Starting 5 workers in us-central1-f...
    Oct 31, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:45:36.139Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 31, 2020 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:45:58.338Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 31, 2020 12:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:46:16.751Z: Workers have started successfully.
    Oct 31, 2020 12:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:46:16.781Z: Workers have started successfully.
    Oct 31, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:46:53.138Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 31, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:46:53.305Z: Cleaning up.
    Oct 31, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:46:53.381Z: Stopping worker pool...
    Oct 31, 2020 12:47:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:47:46.442Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 31, 2020 12:47:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T12:47:46.502Z: Worker pool stopped.
    Oct 31, 2020 12:47:52 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-31_05_45_21-6703902326119062906 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): bb7b2f77-89d7-4c5b-ae3d-7524dce97830 and timestamp: 2020-10-31T12:47:52.456000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    16.553

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 31, 2020 12:47:52 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 45.823 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 35s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/okludgixugeb4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1181

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1181/display/redirect>

Changes:


------------------------------------------
[...truncated 282.44 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 31, 2020 6:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 31, 2020 6:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 31, 2020 6:45:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 31, 2020 6:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 31, 2020 6:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 31, 2020 6:45:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 31, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@103815614]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 31, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 31, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 31, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 31, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 31, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 31, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 31, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 31, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 31, 2020 6:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 31, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 31, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-U4N_Vtfr4zkgv8wPPF_aIo5KDD5LduX4erMf7k5lt7M.jar
    Oct 31, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-0FIWss7NCfMRKanjUTCJUCwIormXzMHTPokFmHUhNh0.jar
    Oct 31, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-kptjDYbMUdAqkcUXSsgX7_50XptHrEH8TaQMt3DVX4I.jar
    Oct 31, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-Lg-dhJgeD3d-rw8nMtXwGrBsbPOEqAbpzWRXSGsuE6Q.jar
    Oct 31, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-4BvnbxAiUve94l9OylC2HNIRPyuT8m6nIBD6YbEEox0.jar
    Oct 31, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-3rTgtnULg90h5jFBRBADC7QJR-3aP3d1HdLDCZpnbyQ.jar
    Oct 31, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-3Uiv5YD2v73JtxeEK8IR88rlD3Ko0RBjq2q-XpJc_qQ.jar
    Oct 31, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-UE2m2oLzfWp4lVrznp9kh96Ws9W-INC3ptb7Tt4tkfY.jar
    Oct 31, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-pfm5dNiNiWBsNPaMhVHDdYHF3wp41deYG4xrI1RNm3w.jar
    Oct 31, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4652081169298044190.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-mWSEGhqCRA9Goez-8qcG42Dl6CCWfle9ohTkO9zirow.jar
    Oct 31, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-RWQDPIu47YPFnaI0f001_L75ZDTBuDqxQsdzz6gmgG0.jar
    Oct 31, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-iZQXTxxLAUDzGeRi8x2n3Oq2Xp2ZUWIJMNqsNKAfUTQ.jar
    Oct 31, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-5Cr8I2rWhPhKfsFGYe5tPZyvskvdnCa26z9kZ1be6Tg.jar
    Oct 31, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-aSIQxkSxob3VM8wU0m6E1Kv3vaZERuXPlYNF1de8l4s.jar
    Oct 31, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-frCVpPMIQSIc2n0lBI3aJm19vHvhuiLYBNg55rfpMv4.jar
    Oct 31, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-FL_tqJzgF8_2HFrfTiDE6uYNRwCUSWyjRWkwKXKkO9w.jar
    Oct 31, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-U4N_Vtfr4zkgv8wPPF_aIo5KDD5LduX4erMf7k5lt7M.jar
    Oct 31, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-P9pEharVPlVavjxJWllkEO5LeOplCl6F02-mt8KM4ng.jar
    Oct 31, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-dnVUjeBYXt_UE-9_GM9cGipYmtWkqVaovtoXonyMVnE.jar
    Oct 31, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-eNdsSAIxe05aD4K6zyYbfU5krtX4qLxtjQ0l8ob71N4.jar
    Oct 31, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-snVKTn8sM_KDeUTgmIdbnwHKzEZvOl5PtK-VuabwayE.jar
    Oct 31, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-E6L7tjyU4KRE40dEHU8AArxpQJzHShDWKAQ5e8eNw28.jar
    Oct 31, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-Fn4Zgn5XNPcDu-WO6p-neg3o77U1hRo1fNaBbtC-IOY.jar
    Oct 31, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-jzQwMmlExjqTCloWCm8I6tVwuNxcywc5UidYhwhMFeo.jar
    Oct 31, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-uG6oFuY6wyYYQCLh-eYVDlEDTdxygbE5ZRJoH6bt-P8.jar
    Oct 31, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-uBizJRY_MO1l20LVd08uND-I7srzGj-uhku4h-Dfcqg.jar
    Oct 31, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-u1D7WYLLX6Qgomq9Kg0EqSHnL_VrAmqvTjr9AfQhq68.jar
    Oct 31, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-8swBmlBGCq50du-U5glvTidLQ4OMq9dNA8tkSAAZB3Q.jar
    Oct 31, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-QtSZ1IOUQ5hpPb5RZ9gMfvnuiuXoZ2wher3I5mw5f04.jar
    Oct 31, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-1T2Z1Tb5BIaA2rs-scx2tz0RohD_ttGc411ax5X7OQ0.jar
    Oct 31, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-9OauFq4owlL24XX3qaNxM5NkJ09VtEcOvZoA45D3kuU.jar
    Oct 31, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 31, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 31, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 31, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 31, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 31, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 31, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96042 bytes, hash 25deee47a5f8b3cc7433e3ef6472e3df06276bddefb4400a07270958404fa9f0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Jd7uR6X4s8x0M-PvZHLj3wYna93vtEAKBycJWEBPqfA.pb
    Oct 31, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 31, 2020 6:45:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-30_23_45_40-10355760484440574731?project=apache-beam-testing
    Oct 31, 2020 6:45:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-30_23_45_40-10355760484440574731
    Oct 31, 2020 6:45:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-30_23_45_40-10355760484440574731
    Oct 31, 2020 6:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-31T06:45:40.725Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 31, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:45:49.080Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 31, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:45:49.714Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 31, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:45:49.756Z: Expanding GroupByKey operations into optimizable parts.
    Oct 31, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:45:49.797Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 31, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:45:50.031Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 31, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:45:50.091Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 31, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:45:50.116Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 31, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:45:50.151Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 31, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:45:50.606Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 31, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:45:50.673Z: Starting 5 workers in us-central1-f...
    Oct 31, 2020 6:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:46:14.057Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 31, 2020 6:46:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:46:18.256Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Oct 31, 2020 6:46:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:46:18.287Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Oct 31, 2020 6:46:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:46:23.613Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 31, 2020 6:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:46:38.348Z: Workers have started successfully.
    Oct 31, 2020 6:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:46:38.388Z: Workers have started successfully.
    Oct 31, 2020 6:47:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:47:11.053Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 31, 2020 6:47:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:47:11.237Z: Cleaning up.
    Oct 31, 2020 6:47:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:47:11.334Z: Stopping worker pool...
    Oct 31, 2020 6:48:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:48:09.025Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 31, 2020 6:48:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T06:48:09.075Z: Worker pool stopped.
    Oct 31, 2020 6:48:13 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-30_23_45_40-10355760484440574731 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 623b611c-83cc-42af-ba17-b3973ea78b53 and timestamp: 2020-10-31T06:48:14.003000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.608

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 31, 2020 6:48:14 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 47.51 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 59s
107 actionable tasks: 73 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/q4shitib4xdbs

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1180

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1180/display/redirect?page=changes>

Changes:

[veblush] Upgrade gax-java 1.60.0 and grpc 1.32.2

[veblush] Hardening ProtoDynamicMessageSchemaTest


------------------------------------------
[...truncated 281.64 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 31, 2020 12:45:46 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 31, 2020 12:45:46 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 31, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 31, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 31, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 31, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 31, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1847488743]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
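
The exception above is the stock Beam error for a PCollection<Row> that has neither a coder nor a schema attached. Below is a minimal, hypothetical sketch of the remedy the message itself suggests (PCollection.setRowSchema); the class name, field names, and values are illustrative only and are not taken from the failing test.

    // Hypothetical sketch: a ParDo that emits Beam Rows gives the SDK nothing to infer a
    // coder from, so the output PCollection needs setRowSchema(...) before it is consumed.
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Schema mirroring the four columns selected by the query in the log.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> rows =
            p.apply("Names", Create.of("alice", "bob"))
                .apply(
                    "ToRow",
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void process(@Element String name, OutputReceiver<Row> out) {
                            out.output(
                                Row.withSchema(schema)
                                    .addValues(name, "story", "example title", 3L)
                                    .build());
                          }
                        }))
                // Without this call, pipeline construction fails with the
                // "Unable to return a default Coder ... setRowSchema" error seen above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }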

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 31, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 31, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 31, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 31, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 31, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 31, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 31, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 31, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
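
For context, the BeamPushDownIOSourceRel and the pushed-down filter logged above come from running a Beam SQL query against a BigQuery table registered with the DIRECT_READ method. The sketch below is a hypothetical reconstruction of that setup; it uses BeamSqlRelUtils (which appears in the stack traces above), the BigQuery table provider from the same package as BigQueryTable, and BeamSqlEnv for table registration. The project, dataset, and table location are placeholders, and the exact DDL used by the integration test may differ.

    // Hypothetical sketch: register a BigQuery-backed table with method DIRECT_READ and run
    // the same shape of query as above, so Calcite can plan a BeamPushDownIOSourceRel that
    // pushes the projection and filter into the BigQuery Storage API read.
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv;
    import org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils;
    import org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTableProvider;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class PushDownSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        BeamSqlEnv sqlEnv = BeamSqlEnv.inMemory(new BigQueryTableProvider());

        // "method": "DIRECT_READ" is what produces the "BigQuery method is set to: DIRECT_READ"
        // lines in the log and enables push-down; LOCATION here is a placeholder.
        sqlEnv.executeDdl(
            "CREATE EXTERNAL TABLE HACKER_NEWS (`by` VARCHAR, `type` VARCHAR, title VARCHAR, score BIGINT) "
                + "TYPE bigquery "
                + "LOCATION 'some-project:some_dataset.hacker_news' "
                + "TBLPROPERTIES '{\"method\": \"DIRECT_READ\"}'");

        PCollection<Row> rows =
            BeamSqlRelUtils.toPCollection(
                pipeline,
                sqlEnv.parseQuery(
                    "SELECT `by` AS author, `type`, title, score FROM HACKER_NEWS "
                        + "WHERE (`type` = 'story' OR `type` = 'job') AND score > 2"));

        pipeline.run().waitUntilFinish();
      }
    }
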
    Oct 31, 2020 12:45:49 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-Yd-eepn8LL7Ak7GvekwMWYOTi_w186uxprTvBA2qcRY.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-IzMrPpNbUpqMo9Enxr_gvXbx9qW5PmZLNRjIzBW4gBE.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-07GnByWW2kDtKvDXAaKLH79LIq7PrmNJWVMrhmXDCPI.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-Ok002lD6v0Y7hlxWGh80EkNbDOlCqRqqbRwDxnJr4G0.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-CLQMMlVDrJkFtlMpElm1npA5P0cPe38sXEIZMb4q3no.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-sLKyCqeMk84I17rngD1vTK6tuLuIJmox8_jyr2XsADQ.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-WM4FPqNh_34wZybIYq5mZBFHIH4ksUVWrWINBP5Lcbw.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-7tYNwMCkDUqTDfq8tLmZRg__GgkzcLFWNP-ixe1D0Cg.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-PhAsVgMvqy5sDUAzkpZitLaNXyKksfO9a9dyaA-RG5A.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-xZh8iV-reyMZa1ApoOysDHdmzD3BZv-4sbHIXfeGYo0.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-Exsrqxl6ecYapcz22DkR7fpLaK2wuk4IfhwfNVMeMx0.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-Q6lF4sywdVrVSpWNVSNzElpZ4LXOOqT3nN-m_4qF0NA.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-3o31ufU-duLD2XiAUKbS48L65z7JGXcLZKb8pExPXcc.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-a-UOMKQpsFBO6Iiox4Y5G68kCuFglkLEbtVE2hXiqtc.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-OEVCHhUFgLfPDUixgDfp7GnoEp0qgjj4XFqsqBRcFEk.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5459958798148937878.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-kFY9E9ZDo22_O2M0Imyrhy9XAjs2VRMOWwYhn2D7i8Y.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-Yd-eepn8LL7Ak7GvekwMWYOTi_w186uxprTvBA2qcRY.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-4hr1gYx7IMJN0NyCcaMXAwxhpPaYdi2bxDwYRVmDAsY.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-s9taiJvwrKzvgqzojTeX3gFJ3sBHmvz1jbXUV6sF2aY.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-bGcYeU_tFMH_t3L74-MzdLzy6-Y72miqLDUOMFk7JTo.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-5jIKFd4k7DgKLHzsSBOH7K9gVSxk3tAkuBIBxFVNbQI.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-iUI0B_42IwGSORXScL5agKrSaMhu7XHCkt23m5QJMtU.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-Jne4pOM6AGRIVLgodBcNxlovR_ZY0GuAxqwuPQWMeRY.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-fctGGcwt6bMsIx7sm9Rp7diQhifD0KUuf3hkkkUZO-k.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-xQGwopE7FrMFiUrQgixoLTn8hSjXJ6B_j8ZoRhxTgcU.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-0Bi-y5k3_TZW3rPDki8Pod8-gK52zKbu2ULVjYlZNNk.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-ZXH6zrl8bIh43ZKtSPb5yhiAaydE2BBaj2C1mXtByPQ.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-Xp24YPb3E3c9Oyfp0QSN3l7xNIQmZFqWQa7vCImEI9I.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-WVtEUZZZZZVbLmRlEYU1M2nSOwNGC5tFFs71zTe8S4A.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-IY0sLi1dnbyMmBlhYN5-O3dd9SijlMyKicK2AsKx4Kc.jar
    Oct 31, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-Dgz-wOGdORNgkppbV9oN2f1DC4Fz_T_7vUAjATGl4-U.jar
    Oct 31, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 31, 2020 12:45:53 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 31, 2020 12:45:53 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 31, 2020 12:45:53 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 31, 2020 12:45:53 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 31, 2020 12:45:53 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 31, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96046 bytes, hash 8ae45db9c88a91e2807efed3a2d41f67e9e225846591e3358598b662a58b7896> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-iuRduciKkeKAfv7TotQfZ-niJYRlkeM1hZi2YqWLeJY.pb
    Oct 31, 2020 12:45:53 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 31, 2020 12:45:54 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-30_17_45_53-11420003038945062906?project=apache-beam-testing
    Oct 31, 2020 12:45:54 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-30_17_45_53-11420003038945062906
    Oct 31, 2020 12:45:54 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-30_17_45_53-11420003038945062906
    Oct 31, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-31T00:45:53.563Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 31, 2020 12:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T00:46:04.029Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 31, 2020 12:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T00:46:04.714Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 31, 2020 12:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T00:46:04.752Z: Expanding GroupByKey operations into optimizable parts.
    Oct 31, 2020 12:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T00:46:04.795Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 31, 2020 12:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T00:46:04.873Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 31, 2020 12:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T00:46:04.903Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 31, 2020 12:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T00:46:04.942Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 31, 2020 12:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T00:46:04.981Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 31, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T00:46:05.458Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 31, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T00:46:05.545Z: Starting 5 workers in us-central1-f...
    Oct 31, 2020 12:46:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T00:46:30.368Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 31, 2020 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T00:46:36.147Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 31, 2020 12:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T00:46:53.319Z: Workers have started successfully.
    Oct 31, 2020 12:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T00:46:53.353Z: Workers have started successfully.
    Oct 31, 2020 12:47:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T00:47:33.945Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 31, 2020 12:47:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T00:47:34.129Z: Cleaning up.
    Oct 31, 2020 12:47:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T00:47:34.215Z: Stopping worker pool...
    Oct 31, 2020 12:48:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T00:48:32.898Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 31, 2020 12:48:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-31T00:48:32.942Z: Worker pool stopped.
    Oct 31, 2020 12:48:38 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-30_17_45_53-11420003038945062906 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 485a9b8c-62db-460e-87a6-6496cfd9caa9 and timestamp: 2020-10-31T00:48:38.462000000Z:
                     Metric:                    Value:
                   read_time                    19.834
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 31, 2020 12:48:38 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 4 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 3 mins 0.224 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 22s
107 actionable tasks: 76 executed, 31 from cache

Publishing build scan...
https://gradle.com/s/trdr6rykbqaoe

Stopped 3 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1179

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1179/display/redirect?page=changes>

Changes:

[noreply] Update beam-2.25.0.md

[zyichi] update CHANGES.md

[ryan.worley] Write Schema enum to BigQuery string field

[noreply] [BEAM-11070] Skip py bundle finalization test.


------------------------------------------
[...truncated 281.45 KB...]
    Oct 30, 2020 6:46:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 30, 2020 6:46:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 30, 2020 6:46:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 30, 2020 6:46:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 30, 2020 6:46:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1621466119]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 30, 2020 6:46:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 30, 2020 6:46:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 30, 2020 6:46:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 30, 2020 6:46:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 30, 2020 6:46:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 30, 2020 6:46:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 30, 2020 6:46:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 30, 2020 6:46:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 30, 2020 6:46:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 30, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 30, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-FFDvt6sa5kzaM-cAiVitGAv9nvuH-j4qTezRzvSYTuU.jar
    Oct 30, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-VortxPYv8tcEMH50TtCE1v2iHEpoKR8UYWM-hAHqPUA.jar
    Oct 30, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-nBnZlvIK1DAfRAyNt22kjLJnK1vXbyKArOxeEVrm-8M.jar
    Oct 30, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-lj7sOVS3GKqlactYinxQSULcQIbs_jap7CM9Cjle0Q8.jar
    Oct 30, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-8omUr1LkRIscmYRcB504_ryQBJTwkuN4l4iu3vPDJIE.jar
    Oct 30, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-1IPLkeHzS3bLOGq4DSThayPp3EegcaRXxAJjc0KfmuY.jar
    Oct 30, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-tY-qp9eGLb2ofD6MxOgxG5eG0FrNhHEJt5FCXciB6ss.jar
    Oct 30, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-JMqhDeTBEleZlnHYA34Guo-zxoLCcJc5RjQocubvxPY.jar
    Oct 30, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-a0b7PpiR4ghPSTfBUI4W49Ld8D_DsBDUesCMOJEUeL4.jar
    Oct 30, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-2cg0sIv50Vf_sDeHvEuxIWQWHPcNnh_J-h7fwlxP0NQ.jar
    Oct 30, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-i2xh9Ba6Itn7cxgPeeyCAH1SBlloNanUuW7uWi4aVA4.jar
    Oct 30, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-PHaFSl6wcbIHu11mD8UC0ctEm6HLrxs5NKNgFTnnVWg.jar
    Oct 30, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-ykkQL0cOb58MVQa4_YBRP1w0-E-77ZQIXM-QOSUs-Tw.jar
    Oct 30, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-86RIID7NjexVIgPAT-3gRpNRSvnJpsLTRXQ-HebpNvk.jar
    Oct 30, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-bmse57Ytxs0c3B4D1aBVZ7jW_RZWfSpmaDfAae-WtCk.jar
    Oct 30, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4352329594049939505.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-XdonUExeL6m3mrnvKIvpsjLxaGHrAxek_6-7DH_xux8.jar
    Oct 30, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-FFDvt6sa5kzaM-cAiVitGAv9nvuH-j4qTezRzvSYTuU.jar
    Oct 30, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-kiwHkWlcPJQCIL8GT4WJ7WcsR3IumUhkqxKyN4Sp6Y8.jar
    Oct 30, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-2PkS_rRIfaAP5O40LzTyGOHDSwSfNIad7bHd8O1VwbE.jar
    Oct 30, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-LvUqo_eWwlf1UmHjUPTjjr2_5LJBburrPJSYgG_Q8m8.jar
    Oct 30, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-zSwF7eLIR_aM9MowMiEQ9ok0FGnMLlTnsY9F4Wcy0Uc.jar
    Oct 30, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-AKG9oDRKDrmEytaksU4zp2gxsTZ5CpoqzoGdIQWz1fo.jar
    Oct 30, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-xrTWnDUW58k2V_CDHy0d9gEWlroYH63eZjkAXQt2Ijs.jar
    Oct 30, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-reqPsKxzbDlCpOHu-ZpKcEkRmdwdscrKQ1FuqxonHYM.jar
    Oct 30, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-6ww6tIlVli5-i6PVQxDsSx0CNHchh47YvdJxPhaYmu4.jar
    Oct 30, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-rGO8-yKg9cTlFTOMSEj8y19touGdHqg3OA2oLeyg76Y.jar
    Oct 30, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-TbFzB7t585MTZl_DzFNrVADa8UvmkYsBn72VLZ4pcUA.jar
    Oct 30, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-urVCQk16LbJd8LaJzsZKwCTRnIoXv31XL7Hl_waSaFI.jar
    Oct 30, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-IFwOT8szq_c5rIlGLgKb4Kgf1XpTXOai9xlEvBLhHaw.jar
    Oct 30, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-tDWMkk2xyfB661oaAyF0DyvN7kyxIWppXmBmZxOWhk0.jar
    Oct 30, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-yraymVKiLDKBuZ8D21vKRxhY0TeIEHyCOhwlO_a_fRs.jar
    Oct 30, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 30, 2020 6:46:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 30, 2020 6:46:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 30, 2020 6:46:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 30, 2020 6:46:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 30, 2020 6:46:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 30, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96042 bytes, hash e39abed740ee236f0fff110a2c0f7ae893c0528ec0d70a0be1103c49cc7660ee> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-45q-10DuI28P_xEKLA966JPAUo7A1woL4RA8Scx2YO4.pb
    Oct 30, 2020 6:46:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 30, 2020 6:46:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-30_11_46_24-534452730161571517?project=apache-beam-testing
    Oct 30, 2020 6:46:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-30_11_46_24-534452730161571517
    Oct 30, 2020 6:46:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-30_11_46_24-534452730161571517
    Oct 30, 2020 6:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-30T18:46:24.327Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 30, 2020 6:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T18:46:32.325Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 30, 2020 6:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T18:46:33.105Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 30, 2020 6:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T18:46:33.153Z: Expanding GroupByKey operations into optimizable parts.
    Oct 30, 2020 6:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T18:46:33.200Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 30, 2020 6:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T18:46:33.272Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 30, 2020 6:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T18:46:33.312Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 30, 2020 6:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T18:46:33.348Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 30, 2020 6:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T18:46:33.383Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 30, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T18:46:33.747Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 30, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T18:46:33.827Z: Starting 5 workers in us-central1-f...
    Oct 30, 2020 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T18:47:02.006Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 30, 2020 6:57:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T18:57:43.286Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Oct 30, 2020 6:57:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T18:57:43.345Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Oct 30, 2020 6:57:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T18:57:46.215Z: Workers have started successfully.
    Oct 30, 2020 6:57:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T18:57:48.651Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Oct 30, 2020 6:57:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T18:57:48.731Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Oct 30, 2020 6:57:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T18:57:52.201Z: Workers have started successfully.
    Oct 30, 2020 6:58:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T18:58:04.525Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Oct 30, 2020 6:58:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T18:58:04.571Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Oct 30, 2020 6:58:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T18:58:24.738Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 30, 2020 6:58:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T18:58:24.911Z: Cleaning up.
    Oct 30, 2020 6:58:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T18:58:25.017Z: Stopping worker pool...
    Oct 30, 2020 7:05:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T19:05:31.313Z: Autoscaling: Resized worker pool from 3 to 0.
    Oct 30, 2020 7:05:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T19:05:31.411Z: Worker pool stopped.
    Oct 30, 2020 7:05:41 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-30_11_46_24-534452730161571517 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 70deeb6b-44ca-4dfa-9103-32c1ac50251c and timestamp: 2020-10-30T19:05:41.179000000Z:
                     Metric:                    Value:
                   read_time                    24.135
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 30, 2020 7:05:41 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 4 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.038 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 19 mins 32.792 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 21m 24s
107 actionable tasks: 75 executed, 32 from cache

Publishing build scan...
https://gradle.com/s/tkgxl2tgsyur6

Stopped 3 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1178

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1178/display/redirect>

Changes:


------------------------------------------
[...truncated 272.17 KB...]
    INFO: BigQuery method is set to: DEFAULT
    Oct 30, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 30, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 30, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 30, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 30, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 30, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])
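
For reference, a query of the shape logged above can also be run through Beam SQL's SqlTransform. The sketch below is illustrative only: it filters an in-memory PCOLLECTION rather than the beam.HACKER_NEWS BigQuery table, and the schema and sample values are assumptions, not taken from the test.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class SqlFilterSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Illustrative schema covering only the columns the query uses.
        Schema schema =
            Schema.builder()
                .addStringField("by")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> items =
            p.apply(
                Create.of(
                        Row.withSchema(schema).addValues("alice", "story", "A story", 3L).build(),
                        Row.withSchema(schema).addValues("bob", "comment", "A comment", 1L).build())
                    .withRowSchema(schema));

        // The input PCollection is visible to the query as the implicit table PCOLLECTION.
        PCollection<Row> filtered =
            items.apply(
                SqlTransform.query(
                    "SELECT `by` AS author, type, title, score "
                        + "FROM PCOLLECTION "
                        + "WHERE (type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }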


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1827110280]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
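
The failure above is the SDK refusing to infer a coder for a PCollection<Row>. A minimal sketch of the remedy the message itself suggests, using an illustrative schema and a trivial pass-through DoFn standing in for RowMonitor (names and field types here are assumptions, not the IT's actual code):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Illustrative schema for the projected columns.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(
                Create.of(Row.withSchema(schema).addValues("alice", "story", "A story", 3L).build())
                    .withRowSchema(schema));

        PCollection<Row> monitored =
            rows.apply(
                    "PassThrough",
                    ParDo.of(
                        new DoFn<Row, Row>() {
                          @ProcessElement
                          public void processElement(@Element Row row, OutputReceiver<Row> out) {
                            out.output(row);
                          }
                        }))
                // A ParDo that emits Row cannot have its output coder inferred;
                // attaching the schema fixes that (equivalent to
                // setCoder(RowCoder.of(schema))).
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }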

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 30, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 30, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 30, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 30, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 30, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 30, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 30, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 30, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
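
For comparison, the pushed-down read logged here roughly corresponds to using BigQueryIO directly with the Storage API: a column projection plus a row restriction instead of a full table scan. This is only an illustrative sketch; the table reference and option values are assumptions, not taken from the test.

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("some-project:some_dataset.HACKER_NEWS") // illustrative table reference
                    .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                    // Only the columns the query needs are read...
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // ...and the filter is evaluated server-side by the Storage API.
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
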
    Oct 30, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-Gn-JGj6UQmXQu4D3R3YfmJf56z1A1WaLxbbzUJhdf3A.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-F_6ndoR5p50R9Hag4TmUoyklaOHVfyI9S4O9-WD_BPc.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-OOEBPewfINAKou7ShT64VXL-XJPfUoLXsw5nrb3-7h4.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-PZp--cuFXYjtk_fcx-NQr_7ZNDKSOfD3F1-CEWEjY6w.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5581857469499481653.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-su-I_yTfQJxEZl-QmHVwycV9jqeofkIGy2J83WeaGGw.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-LpjV30l4cE-CKQhuWWVIUwxUv3KZ1cn91HuFNuOEmiA.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-x9aijCRdDa7xzM7qQQkUV2uD7HoGB7wHk5JDtn4tGbU.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-6KpAClLrVv2rSJ3BgvEC0gi47U_FEsi0PkG5RLmGWwA.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-jrhDQ4DttpYpS5eD4cXeAU1dT5uWlPdWkaRi9HthDtY.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-Jj26bPz5ObXTCdrdaO9Ds-SXuQgjPTmp_4UkyO3wU88.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-EKOfO_1Ez6QAh1iaI6edgPGLND2vc8tLsmBX37rf5aE.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-xzaqxlpMhq1jBSXgPum7V4Bbt8pVdtxlQCrhUJJlJJo.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-JxXGWDJ89ltNFe4DTS_SgF6gUdN1gVdaJLGTjgwFfWo.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-XTnk5g3lD-0IBeRfOWMwFU_Hq_MIqeuI3gShDxbTSSg.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-vH3GFWnMYfqK5AasMoFiLyjgxWGPzHXH0ePT9fNK6n8.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-svqdrf0dWIXWbltnLskigcQyStoWE2Ay17FpA3qgSKo.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-Gn-JGj6UQmXQu4D3R3YfmJf56z1A1WaLxbbzUJhdf3A.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-q31V6gYoyjlEmXjFx1rL9qzWWlNAfwpDs-gy03beZik.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-PG9CfjCBwPYE-z7mw1wJlSFVyhdnan90uKUFcpGI67U.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-Ecw7Zyqg-tHw0gCyL19QMp6qFQQSeOyDuPNEIXqOmHs.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-HHblzOoE4sLmSgp68p4Z-LldVLUidft-e-zZoZv0Tok.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-CIqgbLr4RjFqY91KSsrr0IXgY1qlA3VwNWBWt8G46dE.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-Ih9Ke0ppHz7DQslxCzbezXdnRIFsxFwEkXewsFAlFJc.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-saPYLfAVsj4MC25KjZljhnzcjYU-3fRDMRx3ftXfyrg.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-RytUAaSWNjKK7uLn8Z38x0l4jMN2KKjjR1BMbXJMZ2E.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-NWBidPcjhWZUBwghjdtDHh4tiA77IZM4sISmKp-H6_g.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-A3fKyrFSUF9IiuJBJ2SYmeCybQKfu0xsJx2oVVPIREk.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-U09qiARZEYJh-MAt-q-Xj9xDoQ2eXhBgytkniAfUhEw.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-yKbOnj0qKvI4k2avjmPWZxZGnj0Xo8hiNngeFZ9zC_0.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-ZFdTRtO_NudpPbttEdba6RSORVHA6sqLYXOCD9G_pa0.jar
    Oct 30, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-NLXLNJFDX58kDaOpnL6AYsd3w1TmNRlFPM1_wzBPEys.jar
    Oct 30, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 30, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 30, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 30, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 30, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 30, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 30, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96042 bytes, hash 9a6221871415f3e6d6f109cf2b4e93cf3debe73b924b9a8861f726418a5891c2> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-mmIhhxQV8-bW8QnPK06Tzz3r5zuSS5qIYfcmQYpYkcI.pb
    Oct 30, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 30, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-30_05_45_23-14391923429886508694?project=apache-beam-testing
    Oct 30, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-30_05_45_23-14391923429886508694
    Oct 30, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-30_05_45_23-14391923429886508694
    Oct 30, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-30T12:45:23.110Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 30, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T12:45:31.131Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 30, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T12:45:31.833Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 30, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T12:45:31.877Z: Expanding GroupByKey operations into optimizable parts.
    Oct 30, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T12:45:31.906Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 30, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T12:45:31.982Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 30, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T12:45:32.012Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 30, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T12:45:32.045Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 30, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T12:45:32.078Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 30, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T12:45:32.436Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 30, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T12:45:32.513Z: Starting 5 workers in us-central1-f...
    Oct 30, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T12:45:37.441Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 30, 2020 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T12:46:02.970Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Oct 30, 2020 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T12:46:03.004Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Oct 30, 2020 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T12:46:08.292Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Oct 30, 2020 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T12:46:08.316Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Oct 30, 2020 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T12:46:18.812Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 30, 2020 12:46:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T12:46:30.845Z: Workers have started successfully.
    Oct 30, 2020 12:46:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T12:46:30.884Z: Workers have started successfully.
    Oct 30, 2020 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T12:47:01.864Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 30, 2020 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T12:47:02.032Z: Cleaning up.
    Oct 30, 2020 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T12:47:02.110Z: Stopping worker pool...
    Oct 30, 2020 12:47:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T12:47:50.508Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 30, 2020 12:47:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T12:47:50.550Z: Worker pool stopped.
    Oct 30, 2020 12:47:57 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-30_05_45_23-14391923429886508694 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): a7fd3d64-8230-42bd-bf06-672bb7009611 and timestamp: 2020-10-30T12:47:57.908000000Z:
                     Metric:                    Value:
                   read_time                    10.878
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 30, 2020 12:47:58 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 49.018 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 41s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/6ipzq7ukmbkww

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1177

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1177/display/redirect>

Changes:


------------------------------------------
[...truncated 273.15 KB...]
    INFO: BigQuery method is set to: DEFAULT
    Oct 30, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 30, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 30, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 30, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 30, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 30, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1910743483]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 30, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 30, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 30, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 30, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 30, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 30, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 30, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 30, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 30, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 30, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 30, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-VJ4vqkPgPKD_rjaaBPGYwwT2iILWrnQYv4vubJZSWzk.jar
    Oct 30, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-YVrtDBaEAw9Q19Ads29tR3ChMoBw0v2akX_4vyAm9ZI.jar
    Oct 30, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-M78GDzbcyVW6GI7DUuyc3VPTfAxWgqm55t5yuFr78VY.jar
    Oct 30, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-QOixfSv6DT8EZlbVrnrDRfKpcX91dycSgthZGUXkHQU.jar
    Oct 30, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4995711881572641357.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-yyK4DxNNXV5F1YYrz5tqWakfRtMEeOMCFZqHHrMZkKw.jar
    Oct 30, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-a8vSCA1lxi-1ttH981IDlkbd-I4UA7RNtVxqO8LB4cc.jar
    Oct 30, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-FILgB4ytTvuDApYzcJG90-T5rKShlc8AAiaRw0qZrEI.jar
    Oct 30, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-jd2L4jq4yhA-X8XuVN0IwZi4a8fGNtN8B2to_bh62Tw.jar
    Oct 30, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-SReOUpUHnE8jvg4O2PXbXhTOYXxBGXE1iEIy_XnCxDM.jar
    Oct 30, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-UQNNK8yVn2da3jAgOdum14C-1C4Ma2I5ejfOp8NdGTw.jar
    Oct 30, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-pi72ws4qkFOCabHD2Zr7IL3VisiIt_z3KFtEukjfz8w.jar
    Oct 30, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-4ux8k-gIq0llSwlfp6a2jSr27b_bnlRtzwuMSznBlS0.jar
    Oct 30, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-rJdGVy2RPAyKGRhrHkUQuPxAZhjoGBKQPiolr26fkio.jar
    Oct 30, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-vNa6_kgffAOGW25i-bYCAUShqZIGqsTMluncYadUckY.jar
    Oct 30, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-klQXIPftjHO0QcF43oyChAzH5aKg2Zhz7ce_jeDlfFU.jar
    Oct 30, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-ijAmW43oJEmEm3RS9DeQvFId2N1r9GFP8OIIl80pLkU.jar
    Oct 30, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-VJ4vqkPgPKD_rjaaBPGYwwT2iILWrnQYv4vubJZSWzk.jar
    Oct 30, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-kaKRrGg-TngeVWzMaJkTMR3bTJGOBe1Y_0UZGSko0Uo.jar
    Oct 30, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-rkPMc_uwbCKOtXv7-pjL7SfYxO5gYngHmGtBoJ23M9E.jar
    Oct 30, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-rln5IlWnofCMAaeLjsHkgk8mMOPGtAqr8qKRFtH0NiE.jar
    Oct 30, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-AQNEUyuhtLM6V5AcAV9nPHjR4t5U0rQVNNS3_tBtn0k.jar
    Oct 30, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-xzW9oIv3Lxw4T-bcaQ3KkZM2bG9wREAqikRVUwYTr2Y.jar
    Oct 30, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-8mHPIFsEADWjr1RSvWiY2mVyN6RiMqHXmS3WK6Zb-wU.jar
    Oct 30, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-5MWzi6VSlWEaX0KQ3UIFbMTGSovlI7F3t0cur4H1XXw.jar
    Oct 30, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-w5kRtbrAQXoNMlXrkMlSY33rMIfiIgXgOQGRDd1yIKo.jar
    Oct 30, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-rOYM7zI3TeZ5js44Y7RVWwqf4ajL78qLV2JO5DxnrJQ.jar
    Oct 30, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-Z_romWvA6T4XSxr7qzzYImYE5xA2UpfGsnpQyIJtXqg.jar
    Oct 30, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-ZRfuFE70Ds7Am6JVd5xCdbJcTdjVPVJEIrotCzc3RSk.jar
    Oct 30, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-VGvzRz8CKmvuFDBaxs77W30jHb270tyuYMiMF2wYXkM.jar
    Oct 30, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-E00n7vsN4dQamGPYrfUe4WcjdmUZ0jvxtGo6URYFzdA.jar
    Oct 30, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-t_iOx-U_nMaJBt0DLpiPCDIJa0Ku_wx8uILsOFl6tuo.jar
    Oct 30, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 30, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 30, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 30, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 30, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 30, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 30, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96042 bytes, hash 353043930b09860b63f90bb903f321c83ddf1b0c1b9654a3f3b74a3e5dd5efe7> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-NTBDkwsJhgtj-Qu5A_MhyD3fGwwbllSj87dKPl3V7-c.pb
    Oct 30, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 30, 2020 6:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-29_23_45_22-749955489418875936?project=apache-beam-testing
    Oct 30, 2020 6:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-29_23_45_22-749955489418875936
    Oct 30, 2020 6:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-29_23_45_22-749955489418875936
    Oct 30, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-30T06:45:22.320Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 30, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T06:45:30.637Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 30, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T06:45:31.309Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 30, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T06:45:31.357Z: Expanding GroupByKey operations into optimizable parts.
    Oct 30, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T06:45:31.401Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 30, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T06:45:31.471Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 30, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T06:45:31.500Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 30, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T06:45:31.534Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 30, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T06:45:31.562Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 30, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T06:45:31.919Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 30, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T06:45:32.018Z: Starting 5 workers in us-central1-f...
    Oct 30, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T06:45:44.357Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 30, 2020 6:46:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T06:46:02.417Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Oct 30, 2020 6:46:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T06:46:02.446Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Oct 30, 2020 6:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T06:46:26.032Z: Workers have started successfully.
    Oct 30, 2020 6:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T06:46:26.089Z: Workers have started successfully.
    Oct 30, 2020 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T06:46:55.104Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Oct 30, 2020 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T06:46:55.155Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Oct 30, 2020 6:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T06:47:00.448Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 30, 2020 6:47:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T06:47:16.640Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 30, 2020 6:47:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T06:47:16.798Z: Cleaning up.
    Oct 30, 2020 6:47:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T06:47:16.875Z: Stopping worker pool...
    Oct 30, 2020 6:48:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T06:48:07.613Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 30, 2020 6:48:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T06:48:07.657Z: Worker pool stopped.
    Oct 30, 2020 6:48:12 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-29_23_45_22-749955489418875936 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 1ed6765a-9b40-4fdd-9ae1-926fe16b78b7 and timestamp: 2020-10-30T06:48:12.811000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    30.187

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 30, 2020 6:48:13 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 3 mins 5.047 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 56s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/symiv2pkkshhs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1176

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1176/display/redirect?page=changes>

Changes:

[Boyuan Zhang] [BEAM-10869] Make WriteToPubsub output serialized PubsubMessage proto

[tysonjh] Increase Dataflow V2 validates runner timeout.

[noreply] [BEAM-9615] Add schema coders and tests. #13209

[noreply] [BEAM-5504] Add Avro support to Pubsub table provider (#12780)

[noreply] [BEAM-9547] Implementation for drop, explode (#13139)

[noreply] Merge pull request #13210 from [BEAM-10994] Update hot key detection log


------------------------------------------
[...truncated 275.14 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 30, 2020 12:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 30, 2020 12:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 30, 2020 12:45:40 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 30, 2020 12:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 30, 2020 12:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 30, 2020 12:45:40 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 30, 2020 12:45:40 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1910743483]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
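
    For context on the coder failure above: the RowMonitor ParDo emits Beam Row elements, and a
    PCollection of Rows only gets a coder once a schema is attached, which is exactly what the error
    message suggests via PCollection.setRowSchema. The following is a minimal, self-contained sketch
    of that remedy, not the test's actual code; the class name, the four-field schema (mirroring the
    author/type/title/score projection in the query), and the seed element are illustrative
    assumptions, and running it requires a runner (e.g. the direct runner) on the classpath.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFixSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Illustrative schema; the real test derives its schema from the BigQuery table.
        final Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();

        PCollection<Row> rows =
            p.apply(Create.of("seed"))
                .apply(
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void process(OutputReceiver<Row> out) {
                            // Emit one Row, analogous to a ParDo that outputs Rows (e.g. RowMonitor).
                            out.output(
                                Row.withSchema(schema)
                                    .addValues("alice", "story", "hello", 3L)
                                    .build());
                          }
                        }))
                // Without this call, coder inference for Row fails and pipeline construction
                // throws the IllegalStateException shown above; setRowSchema attaches a schema
                // (and therefore a coder) to the ParDo output.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }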

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 30, 2020 12:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 30, 2020 12:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 30, 2020 12:45:40 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 30, 2020 12:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 30, 2020 12:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 30, 2020 12:45:40 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 30, 2020 12:45:41 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 30, 2020 12:45:41 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 30, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-G_Jhfgb00m-QPCVNs5PnEpWbLpt_QstM8dwW_RBPl-c.jar
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-b2VF5gnfDQPgMusCgm7C_EFMwpQqFiHDcHIzFJViFBA.jar
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-ncWPphoPVejppf7CUpKJv4YUYHjEUDQP_r2ZY9bB5OY.jar
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-_ci8sRGsTUKktY8F8owLPaZFRF3Qi2kT3QFfqLh-KC8.jar
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-mMyT3VK2-m2oatogz2XaygnesY-yq0OetruOcAFw-0w.jar
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-IQDU51jpxaTy1vNiYBj4MW8zuDfT2WLkFnt7sk2r-T0.jar
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-Tt9ppPBOCVHSgytcmCwYCIATY3vW0Iuf6UnIeUl2cFw.jar
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-oWhrJONwpMHvUrqqyL1r3kwz5fHvFmI_ey5-KQj111U.jar
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-paKgmyk5BO-0wb39YJtDuVpeU3QZqex_wYyHNdgpuwk.jar
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5403706780833234580.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-hlmhHK1UBT2ydQ7dD0gdxstCtUx4_J6q8BB9GkGtndA.jar
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-IwlV5aGCtlje0HZ4pI4hf_dcaxNV5KXNqpVsE4Y1uDc.jar
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-zCqqoavJ2Xwc3zxaMc_HtlSZ-7tVPeenqNnWlosqjB0.jar
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-JFOnVoiI4YHHboG8eZdrUSEbEairQiYH9euuCKAP5Pk.jar
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-qCK1z9vIM4pbU9NUC3IUFxUYN2jbOMIks7F0DIFIkZo.jar
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-jN151FehxH0eUADRoJx7rNLDUgcWFWqqUfdXNFg658M.jar
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-Hvkn-mxmBfigLyuLGujpk5n-Lp62DkRLyu2ghGXV-EY.jar
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-G_Jhfgb00m-QPCVNs5PnEpWbLpt_QstM8dwW_RBPl-c.jar
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-5t0fdahHv45C380SxCqtJODvFNLdlPKuSS3FSldDfSo.jar
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-TCglXr2kQpmaB7GwgvJ_FuMOGSKhV4EH6InE8VOLx28.jar
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-Ym9Dz4k5UQIul9tWNI1Xl_JLN8VHHAeUAKXy5sNiNDE.jar
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-xe4RFR6V2nTxY5nlgPulBMdK9cZAkubWRsrGT5AmxHs.jar
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-iWXGBhpDZKzbj6QHIU_TU8FZpRry7sIhKqvyVwDbDTw.jar
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-jUg1XkHZUrLtqDjS05O-GbA0OvfMKriKrUmrGNQiu6Q.jar
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-g2oYJJanBDpWKjBqdKrE9jxdhALCi0usgAUPx1--1Uc.jar
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-_EmipYWdH7R40XGES-cmF-9YAaZmeorLfJfW9mRpDO4.jar
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-dssvDfyaRMirMbDsBgt05I-qm7TsIXNt-7BEvdVMlGM.jar
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-zj8a--zrxcfyYVuoySu31pZ4MdpZmR-A_lL6C1iJB4g.jar
    Oct 30, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-uq75QN7FYd5HNw1WdG24Bkvs_mx-vVLgN8_Fysq-qYo.jar
    Oct 30, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-07wHAI8m1R7t-hGIhArJ5BYjMjAEnsc0SJoov4oIBAQ.jar
    Oct 30, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-xGckgevRL18_BTzo8olVXFVIwEph7_I9mocHtzKgkTc.jar
    Oct 30, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-VbS9IE5wy4hqJjquIK13g_1V4KkX7KNE0RDmGfsFQrg.jar
    Oct 30, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 30, 2020 12:45:45 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 30, 2020 12:45:46 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 30, 2020 12:45:46 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 30, 2020 12:45:46 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 30, 2020 12:45:46 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 30, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96042 bytes, hash 7df43d55c328847822f59c8c4b46ed7f8ee3d0b3d93dce127cbe3df9a690ffad> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ffQ9VcMohHgi9ZyMS0btf47j0LPZPc4SfL49-aaQ_60.pb
    Oct 30, 2020 12:45:46 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 30, 2020 12:45:47 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-29_17_45_46-565797141740540708?project=apache-beam-testing
    Oct 30, 2020 12:45:47 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-29_17_45_46-565797141740540708
    Oct 30, 2020 12:45:47 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-29_17_45_46-565797141740540708
    Oct 30, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-30T00:45:46.593Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 30, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T00:45:54.807Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 30, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T00:45:55.510Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 30, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T00:45:55.565Z: Expanding GroupByKey operations into optimizable parts.
    Oct 30, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T00:45:55.608Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 30, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T00:45:55.701Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 30, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T00:45:55.735Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 30, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T00:45:55.771Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 30, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T00:45:55.828Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 30, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T00:45:56.213Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 30, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T00:45:56.294Z: Starting 5 workers in us-central1-f...
    Oct 30, 2020 12:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T00:46:17.783Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 30, 2020 12:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T00:46:22.493Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 30, 2020 12:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T00:46:56.947Z: Workers have started successfully.
    Oct 30, 2020 12:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T00:46:56.977Z: Workers have started successfully.
    Oct 30, 2020 12:47:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T00:47:26.798Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 30, 2020 12:47:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T00:47:26.965Z: Cleaning up.
    Oct 30, 2020 12:47:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T00:47:27.063Z: Stopping worker pool...
    Oct 30, 2020 12:48:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T00:48:26.221Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 30, 2020 12:48:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-30T00:48:26.273Z: Worker pool stopped.
    Oct 30, 2020 12:48:32 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-29_17_45_46-565797141740540708 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 05ab0cad-3e8a-4f9d-bfd5-abc977f29927 and timestamp: 2020-10-30T00:48:32.452000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    11.197

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 30, 2020 12:48:32 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 3 mins 1.012 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 15s
107 actionable tasks: 70 executed, 37 from cache

Publishing build scan...
https://gradle.com/s/qydxmcjgzz2m6

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1175

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1175/display/redirect?page=changes>

Changes:

[noreply] [BEAM-9547] Dataframe corrwith. (#13141)

[noreply] Optimizes extract_output for 1 element accumulator case. (#13220)

[noreply] [BEAM-5465] Run cleanVendor task when clean is invoked (#13222)


------------------------------------------
[...truncated 328.53 KB...]
    Oct 29, 2020 6:47:54 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 29, 2020 6:47:54 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 29, 2020 6:47:54 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 29, 2020 6:47:54 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 29, 2020 6:47:54 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 29, 2020 6:47:54 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 29, 2020 6:47:54 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1910743483]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
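
    The same construction-time failure recurs in this run. The other remedy the message lists is
    setting a coder manually; a hedged sketch of that variant is below, using RowCoder. The helper
    name and parameters are illustrative assumptions, not part of the test.

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class AttachRowCoderSketch {
      // Hypothetical helper: attach a RowCoder built from the known schema so the
      // "No Coder has been manually specified" branch of the error above no longer applies.
      static PCollection<Row> attachCoder(PCollection<Row> rows, Schema schema) {
        return rows.setCoder(RowCoder.of(schema));
      }
    }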

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 29, 2020 6:47:54 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 29, 2020 6:47:54 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 29, 2020 6:47:54 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 29, 2020 6:47:54 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 29, 2020 6:47:54 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 29, 2020 6:47:54 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 29, 2020 6:47:54 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 29, 2020 6:47:54 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 29, 2020 6:47:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-x__fAiKttDEAcUUXzOMvIOZSCExGFlMSXLAKAEitY2s.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-L9ACrDZM04gm8298LDMdiG1-L5uk1geUfTt_QhuD-XY.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-AbUdGWWFlapenL10m_8t6ywuDOdbZs8svw_08a3fZtY.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-9Q-2Q_Z-1tB0U7kh0Kgkzi6PkTeeQQpVJ0dKqb0fyss.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-KkWX-wVOfPwPVpnksFv3Mx-eAm5ecMOtjyXU8mQ8OPY.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-4DfzlCN0wZlzOejFILVSqr8K0dCjfT3a0bPFNJTAq1U.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-aRDVZ4NMq-wEUsHQQccDDPMlTRHyBrnYQMOqZit5UWI.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-6byGAmv85ipbNWQNKPf1Nc040m9iIYFCDI6jDs9XJhM.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests--GsrbzN7wAqSJ7UaDqrPJ9Jekp0ikPGPRtFo9U1zuQk.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-DxU6aI7_yMBZuUWbZULwCppo66_JHHK5rl73fGaemp8.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-UFhcnrzKriSBuh9d2B97aQHuMR2MByJA-npUcaMzETc.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-6-2Tssd52IU4eU6xz2-o6gbD0JsWjZmfzBbEVoVkLVk.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-zAw7mur7AC7Zj7YxElQFo7i7xhPxQDbmvTd4y_wgazA.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5750827553121083375.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-qRUXrVoi6mnlS61kCKwR40YxJb8s27Lu7S5B88dpfCk.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-Ga_zXN5VA3eQ9-fnk2h98MDi-UFuRAq54O1GEbSJ_VI.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-_TXaqibebJqQkI9MPMLZhkFpPUZHjh_seHy_bo5aZrc.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-L9ACrDZM04gm8298LDMdiG1-L5uk1geUfTt_QhuD-XY.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-N8SQHNqhussx9Vhksmu7HULY40omBExQywpWk8xQq1M.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-ZtZRrcJBqUelgWc-01v1-S5SHKqbadKfyCRyjGYGWSc.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-r52orphggr2bEvTUO8aoJgVsr4ESpINKWwkbqAlYyK4.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-5NRu_opXa798zvTek74c_UjNk0mRSQG3eWNgVYInhio.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-CA0meEUtYdUyPzawF2ShILDrGfXh3I2hZ7fUrqrU66Y.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-_b-OKX1t79NpO0AWi_C83AvolvS7ZgNJPCGMuv0ifdg.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-wnOROZVstH4rxB1Gu0_qByq8Wc5khC4XH8sN3cLXYxQ.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-gbgWocrRHJ99Mks9QfsVVtOG4_01KNqfV_zpoEnZ_9s.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-hzMoxPO9eHmknyRnCcpb06k4BP8-hRLJOJNQ-QkgFDk.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-zkBxU8MzL3S0mza0et1pQXzeyCL7VH_I1D32ZR51ibg.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-yWOo5SwM7DtUVKhtYA14w3ENJ39kGYVnZYbPzAm_HxA.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-lZrQ0UPnmxmXGAyVZc6b987mMw7SoOGjqXO1dHKxFn4.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-hRA46fnqqUORXqtU2wO5Her_PP_FSj2MpnZVo9_dEoc.jar
    Oct 29, 2020 6:47:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-TJAodnu9NHxKdNSTuVo4z-pA3AJiw-XN7mW2S0-KMMI.jar
    Oct 29, 2020 6:47:59 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 29, 2020 6:47:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 29, 2020 6:47:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 29, 2020 6:47:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 29, 2020 6:47:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 29, 2020 6:47:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 29, 2020 6:47:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96042 bytes, hash 710c53c1afeb021df1da46213c09945e6afbdbe24383918f45f7be125f2a0661> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-cQxTwa_rAh3x2kYhPAmUXmr72-JDg5GPRfe-El8qBmE.pb
    Oct 29, 2020 6:47:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 29, 2020 6:48:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-29_11_48_00-16658491067004759186?project=apache-beam-testing
    Oct 29, 2020 6:48:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-29_11_48_00-16658491067004759186
    Oct 29, 2020 6:48:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-29_11_48_00-16658491067004759186
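Besides the gcloud command above, a submitted job can also be cancelled programmatically through the PipelineResult handle that Pipeline.run() returns. A minimal sketch, assuming the caller still holds that handle; the helper name and timeout policy are illustrative, not part of this test:

    import java.io.IOException;
    import org.apache.beam.sdk.PipelineResult;
    import org.joda.time.Duration;

    public class CancelIfStillRunning {
      /** Waits up to the given timeout, then cancels the job if it has not reached a terminal state. */
      static void cancelIfStillRunning(PipelineResult result, Duration timeout) throws IOException {
        PipelineResult.State state = result.waitUntilFinish(timeout);
        if (state == null || !state.isTerminal()) {
          // Programmatic equivalent of the gcloud cancel command printed above.
          result.cancel();
        }
      }
    }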
    Oct 29, 2020 6:48:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-29T18:48:00.133Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 29, 2020 6:48:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T18:48:12.041Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 29, 2020 6:48:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T18:48:12.720Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 29, 2020 6:48:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T18:48:12.779Z: Expanding GroupByKey operations into optimizable parts.
    Oct 29, 2020 6:48:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T18:48:12.840Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 29, 2020 6:48:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T18:48:12.927Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 29, 2020 6:48:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T18:48:12.961Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 29, 2020 6:48:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T18:48:13.002Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 29, 2020 6:48:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T18:48:13.038Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 29, 2020 6:48:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T18:48:13.560Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 29, 2020 6:48:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T18:48:13.661Z: Starting 5 workers in us-central1-f...
    Oct 29, 2020 6:48:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T18:48:25.034Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 29, 2020 6:49:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T18:49:07.470Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Oct 29, 2020 6:49:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T18:49:07.505Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Oct 29, 2020 6:49:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T18:49:15.225Z: Workers have started successfully.
    Oct 29, 2020 6:49:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T18:49:15.264Z: Workers have started successfully.
    Oct 29, 2020 6:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T18:49:49.293Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Oct 29, 2020 6:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T18:49:49.405Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Oct 29, 2020 6:49:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T18:49:53.969Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 29, 2020 6:49:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T18:49:54.388Z: Cleaning up.
    Oct 29, 2020 6:49:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T18:49:54.488Z: Stopping worker pool...
    Oct 29, 2020 6:51:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T18:51:32.235Z: Autoscaling: Resized worker pool from 4 to 0.
    Oct 29, 2020 6:51:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T18:51:32.307Z: Worker pool stopped.
    Oct 29, 2020 6:51:37 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-29_11_48_00-16658491067004759186 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 7a35d756-7968-4ca7-ae01-44e3a9098124 and timestamp: 2020-10-29T18:51:37.787000000Z:
                     Metric:                    Value:
                   read_time                    19.266
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 29, 2020 6:51:38 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
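This warning means the run supplied no InfluxDB measurement/database, so the read_time and fields_read values above are only printed, not persisted. A hedged sketch of how such settings are typically assembled, assuming the test-utils InfluxDBSettings builder exposes withHost/withDatabase/withMeasurement/get; the method names and all concrete values below are assumptions, not taken from this log:

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    public class InfluxSettingsSketch {
      // Assumed builder API; verify against the actual InfluxDBSettings class before relying on it.
      static InfluxDBSettings exampleSettings() {
        return InfluxDBSettings.builder()
            .withHost("http://localhost:8086")           // assumed endpoint
            .withDatabase("beam_test_metrics")           // assumed database name
            .withMeasurement("sql_bqio_read_java_batch") // assumed measurement name
            .get();
      }
    }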

Gradle Test Executor 11 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 3 mins 53.165 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 23s
107 actionable tasks: 106 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/dnpokqmihcdey

Stopped 10 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1174

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1174/display/redirect>

Changes:


------------------------------------------
[...truncated 271.94 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 29, 2020 12:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 29, 2020 12:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 29, 2020 12:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 29, 2020 12:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 29, 2020 12:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 29, 2020 12:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 29, 2020 12:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1910743483]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
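The root cause named in this stack trace is an unset row schema on the ParDo output, and the error text itself points at the fix. A minimal self-contained sketch of that fix, assuming the DirectRunner and an in-memory input; the schema, values, and class name are illustrative, and only the query shape mirrors the failing test:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TypeDescriptor;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        // Schema mirroring the four columns the failing query projects.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        Pipeline p = Pipeline.create();

        PCollection<Row> rows =
            p.apply(Create.of("story", "job", "comment"))
                .apply(
                    MapElements.into(TypeDescriptor.of(Row.class))
                        .via((String type) ->
                            Row.withSchema(schema).addValues("alice", type, "a title", 3L).build()))
                // The call the error message asks for: without it the ParDo output has no
                // schema, and coder inference fails exactly as logged above.
                .setRowSchema(schema);

        rows.apply(
            SqlTransform.query(
                "SELECT author, `type`, title, score FROM PCOLLECTION "
                    + "WHERE (`type` = 'story' OR `type` = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }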

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 29, 2020 12:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 29, 2020 12:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 29, 2020 12:45:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 29, 2020 12:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 29, 2020 12:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 29, 2020 12:45:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 29, 2020 12:45:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 29, 2020 12:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
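For comparison, the column pruning (usedFields) and filter shown here can also be requested directly on a BigQueryIO Storage API read, outside Beam SQL. A minimal sketch assuming the public Hacker News dataset as the table; the table spec and runner setup are illustrative, and only the selected fields and restriction mirror the pushed-down predicate above:

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full") // illustrative table spec
                    .withMethod(Method.DIRECT_READ)                // BigQuery Storage Read API
                    // Column pruning, analogous to usedFields=[by, type, title, score].
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // Server-side filter, analogous to the pushed-down predicate above.
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }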
    Oct 29, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-h29ub5AOizyiS7K0Z3f8ik94R407ivnODapKYWPDO14.jar
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-sNbcDKa3dCmhRsoIb9FSSGtENowpNyrDUkklsXqLx-Y.jar
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-rZXvKyGzDHrGib91hgNYNRNIUvG5oKPiVvPHGGgn_H4.jar
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8570504886480073673.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-64G4KXAgbed3qJrijVBp5zDXMo7tNJnZS70Vt5K7b6s.jar
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-Rq5AGYn-03CQ_v4QT4YI3LJ9jQPBW1OTWz_SqVfwyUQ.jar
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-JRwata1TegNLAno0cpIf0_tlnQjKCQjprvKrosqJmi4.jar
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-rL4C_e3Lmu-u6k6TIeSymLIHWt9GH75Exs9XrXvUv98.jar
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-1CJXPiP08BXPtBBU9J429Emx4YgRfZON4_0KioR0wrI.jar
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-2XGL48TbAxSKbbvz19eZDvBW1cMp3I3gXJ4bHQ6B-Mc.jar
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-Qu_YvHahpnCJdIfrVa9-jrMudHbQ0Rj-B5oEU_1M__Q.jar
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-X2KLvkqCupu9t9EIiL8grFwxb0E1ob1lb7buqx4uH7A.jar
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-N7sEX4-BE9yRqErUD2nE7BDhEPEQNV2FOQaljYNX8lo.jar
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-1QAjCrvmj1FfnE_nt2bMHiJFyc8dLWclIf2RrZuIzrk.jar
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-SIdo2Djh3ixJj2IszzCgE-aUZ33mkt7JXZPoIyRbAcg.jar
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-gXj98VGXniRzovyQZ7qoFwpULWDNH3NTKj8GoyT6YiU.jar
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-igccG4IGcYe2nY8jWgVmjJ2hYqWuSpvDBHz8ecYVkf0.jar
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-oNGefTr3PZHMs6HhV1BCyz8xwSVt6OMO1KNeFG5chA8.jar
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-XIJrw8C9-7j-G4PJijg3_P2n5B0qcpew2AenC_Zuzr0.jar
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-cTQZ5GFxKQKdcPuG51p_rq5maPgWowfYFS_nfpIoALU.jar
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-GgtggIx5O5mskZnaMX0cyZgztYd2dtD3BR5sEerbyY4.jar
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-Tl7uQQl8wqnfdZfENMBj_FnNi8g8yp9nynbZRlZG2wQ.jar
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-sNbcDKa3dCmhRsoIb9FSSGtENowpNyrDUkklsXqLx-Y.jar
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-40HYqiINrNt5wnHBYBiYu5UVwdq0QiwGrELr180KZ1c.jar
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-mROsYdIF2iCr2TyaWmuvbu7pkcy42UT8CEDGjK19yco.jar
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-RkJLSx-N5CUnhDz3DEkDfuCYKH7ugHdc8ALKf3Gesjk.jar
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-tSAlKRBnj7bk4EZY61J5M5sAcYVI0MECGZ2iK_Othfs.jar
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-FSkX0ENOV2QCy561YlqF6urbiuC9Q5HQNPC8QC-jbYs.jar
    Oct 29, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-QkjgNxLnUYWNDW9iiTRIn4RSeflMiJlWf8EQNVPKglE.jar
    Oct 29, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-ytnK9txHsDcBhKpAOAA0UIXEM85j8pBPorrSTH_JSDc.jar
    Oct 29, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-_R5JUdlc4upFYvXgIeA3r3KCcpWW7tYV2tCZimZHpj8.jar
    Oct 29, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-MSHJYln9RVTpfm0-6PI06lxXVAxYKievpWxWlG6e2bI.jar
    Oct 29, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 29, 2020 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 29, 2020 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 29, 2020 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 29, 2020 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 29, 2020 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 29, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96042 bytes, hash 28226bd228a73efd9763823f3b2d5bbf3b02f1d028d5eb4ddd6ab07470205d85> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-KCJr0iinPv2XY4I_Oy1bvzsC8dAo1etN3WqwdHAgXYU.pb
    Oct 29, 2020 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 29, 2020 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-29_05_45_28-607563578325299172?project=apache-beam-testing
    Oct 29, 2020 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-29_05_45_28-607563578325299172
    Oct 29, 2020 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-29_05_45_28-607563578325299172
    Oct 29, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-29T12:45:28.372Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 29, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T12:45:38.131Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 29, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T12:45:38.811Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 29, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T12:45:38.861Z: Expanding GroupByKey operations into optimizable parts.
    Oct 29, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T12:45:38.883Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 29, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T12:45:39.074Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 29, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T12:45:39.097Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 29, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T12:45:39.126Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 29, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T12:45:39.159Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 29, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T12:45:39.563Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 29, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T12:45:39.630Z: Starting 5 workers in us-central1-f...
    Oct 29, 2020 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T12:46:01.031Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 29, 2020 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T12:46:14.066Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Oct 29, 2020 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T12:46:14.104Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Oct 29, 2020 12:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T12:46:19.393Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 29, 2020 12:46:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T12:46:38.082Z: Workers have started successfully.
    Oct 29, 2020 12:46:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T12:46:38.125Z: Workers have started successfully.
    Oct 29, 2020 12:47:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T12:47:16.258Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 29, 2020 12:47:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T12:47:16.467Z: Cleaning up.
    Oct 29, 2020 12:47:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T12:47:16.571Z: Stopping worker pool...
    Oct 29, 2020 12:48:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T12:48:09.456Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 29, 2020 12:48:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T12:48:09.530Z: Worker pool stopped.
    Oct 29, 2020 12:48:15 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-29_05_45_28-607563578325299172 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 9699ca18-6d94-41cb-b749-8e1f5596d972 and timestamp: 2020-10-29T12:48:15.753000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    17.273

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 29, 2020 12:48:16 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 3 mins 1.593 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 52s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/oiaik52dg7ak6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1173

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1173/display/redirect?page=changes>

Changes:

[noreply] [BEAM-11108] Add a version of TextIO implemented via SDF. (#13188)


------------------------------------------
[...truncated 271.84 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 29, 2020 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 29, 2020 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 29, 2020 6:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 29, 2020 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 29, 2020 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 29, 2020 6:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 29, 2020 6:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1725863051]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 29, 2020 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 29, 2020 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 29, 2020 6:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 29, 2020 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 29, 2020 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 29, 2020 6:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 29, 2020 6:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 29, 2020 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 29, 2020 6:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-fbhrZaEuNuJ6wTzNY0bl1kcV0Qo48ciR-3LCa0VRRDY.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-sv4GIHch0ugUTifM8EB8HjyKvl6YECT66ssVoZRlYOw.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-XFu28fDUz8lUntd3YJgSgrYsTMwFJ-Hhie5naIBXNMw.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-KTlgvokN0QdSVVCXNmjydbKnvM6RTNwrDrZ_7Ff74fU.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-v9O8eH4L57FuTx-E8Yx2wPoL2b8zzXow6sgEHO1651A.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-Nrb-sed3V1mPBywjxgLjuKYMUCh2OTGDn0KoUXE1cH8.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-5u7xI8EttwQAa4vdQCymBA3VtpGR7au3zZsd2O-XjQU.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-ldjxP85ffAtH7ngOjPTfbPHW9tHqgvgVHWXzxk268SY.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-hmH80gpJnRfgbgXiwwxXObTkxK6WPQIpEvL2Dq5m93I.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-YJU7XlO2Z_qt-K8HGAHa60jH2CY2lnvCYWgGqgD7bxY.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-hpm2W2XFZ-AEBwmc9DY-GZ_hO5q2Rjv4pzNvlNmIluI.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1034469044724150658.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-3zt19AilgxiJNVZFHyI7QCyqd3WtAKshCzpc4kDmbtk.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-ziOX333U2YF_gMs3IQVjs7O6IzCqi539EqHNHSbNqmg.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-opWvDv0tfASyszv8f_wsQWVg-LBOzAGsFWubVP2pSY4.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-yYmyAk9T8gEcYmx_Y764ERDCxO9bGhQNApXF4n4cWFE.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-Q8_LemxolFtf1j-1OxgzHcDKxuPY7PzzYEY2SR1Rdf0.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-fbhrZaEuNuJ6wTzNY0bl1kcV0Qo48ciR-3LCa0VRRDY.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-W6e-TJ1fO4kCE5zIPi3DDClEGSZIqrFFckxtn6cM1FA.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-W7RBmR4zsG75GpnMqP1fWEHgXn1D0Wh61pKNCixgfEQ.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-ezVVkJsMWY24JK8Sxn30Ka3skaNI0jO5KFALmfuXArA.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-X64P0WyUSzRNRi9YnaXXZEU_RoG5ek1ao3nPsZpXbRg.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-Vc2sIryWPMzyZHxzH2uV4wBx0vAGMI7kqtX_LuV9-k4.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-ozEmeQ9E-gFuYAi2KF2uWqeNWhamTilFZy-47W0jMcM.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-F9XTD8Bf2Vs7fJnL8Vhxq7FKHUVGuOvK9n7A5XLXI_I.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-Fw8rHC3x8Y83j2uwwiXOX5YQygz_6vgkg4q6XUfsMJA.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-srrxneVJLWZ6Y9BcNr-P0IV0itq_dR-38_jL5-x32nY.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-O7kFJizTMoiJzmFFJxgGyUs_fiFPuLgoD-ZFKUZvWcQ.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-3JA74AYrWzObARP7OBNgITlGRSr6Qau1wMEboga9nEc.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-VH40XnWB_MmuwMshhWgAgvjLOAXv0yDvaivpgn_F-9Q.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-d1vDr2IXaLgM763_sw3lWRD55_noBa65DBsk3R427Sg.jar
    Oct 29, 2020 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-i-nn-aCRGNunPj9F1o8Kjd9LISpFScqPgf5yneNftKk.jar
    Oct 29, 2020 6:45:08 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 29, 2020 6:45:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 29, 2020 6:45:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 29, 2020 6:45:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 29, 2020 6:45:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 29, 2020 6:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 29, 2020 6:45:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96042 bytes, hash f0c29e83cc5a9fb744a79d6d4624fb7f01925d225b6d7733805aef52f7bf63ba> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-8MKeg8xan7dEp51tRiT7fwGSXSJbbXczgFrvUve_Y7o.pb
    Oct 29, 2020 6:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 29, 2020 6:45:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-28_23_45_08-3081742045127146031?project=apache-beam-testing
    Oct 29, 2020 6:45:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-28_23_45_08-3081742045127146031
    Oct 29, 2020 6:45:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-28_23_45_08-3081742045127146031
    Oct 29, 2020 6:45:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-29T06:45:08.952Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T06:45:17.259Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T06:45:18.047Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T06:45:18.091Z: Expanding GroupByKey operations into optimizable parts.
    Oct 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T06:45:18.130Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T06:45:18.208Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T06:45:18.251Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T06:45:18.295Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T06:45:18.326Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 29, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T06:45:18.869Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 29, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T06:45:18.938Z: Starting 5 workers in us-central1-f...
    Oct 29, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T06:45:28.115Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 29, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T06:45:46.726Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Oct 29, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T06:45:46.766Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Oct 29, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T06:45:51.986Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 29, 2020 6:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T06:46:07.225Z: Workers have started successfully.
    Oct 29, 2020 6:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T06:46:07.257Z: Workers have started successfully.
    Oct 29, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T06:46:44.485Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 29, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T06:46:44.753Z: Cleaning up.
    Oct 29, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T06:46:44.874Z: Stopping worker pool...
    Oct 29, 2020 6:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T06:47:34.252Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 29, 2020 6:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T06:47:34.301Z: Worker pool stopped.
    Oct 29, 2020 6:47:39 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-28_23_45_08-3081742045127146031 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 33c04d66-1148-4f5c-a9b2-e1db2720fc0b and timestamp: 2020-10-29T06:47:39.798000000Z:
                     Metric:                    Value:
                   read_time                    17.302
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 29, 2020 6:47:40 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 445 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.001 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.002 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 45.197 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 22s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/pbzpw4p76c42g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1172

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1172/display/redirect?page=changes>

Changes:

[sychen] Add max buffering duration for GroupIntoBatches in Python SDK

[tysonjh] [INFRA-20858] Update JDK name to match Jenkins.

[noreply] [BEAM-7372][BEAM-8371][BEAM-9372] drop python 2.7 and 3.5 support from


------------------------------------------
[...truncated 273.62 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 29, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 29, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 29, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 29, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 29, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 29, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 29, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1214853303]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 29, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 29, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 29, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 29, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 29, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 29, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 29, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 29, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 29, 2020 12:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
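
As a reminder of how these runs reach the Dataflow service: the runner, project, region, and staging/temp locations all come from pipeline options supplied to the test. A minimal, illustrative sketch (project, region, and bucket are placeholders, not this job's settings):

    import org.apache.beam.runners.dataflow.DataflowRunner;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DataflowOptionsSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        options.setRunner(DataflowRunner.class);        // execute on the Dataflow service
        options.setProject("my-gcp-project");           // placeholder
        options.setRegion("us-central1");
        options.setTempLocation("gs://my-bucket/temp"); // staging target, like the gs:// paths below

        Pipeline p = Pipeline.create(options);
        // ... build and run the pipeline ...
        p.run().waitUntilFinish();
      }
    }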
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-Sv-J_x1AY2TwblYIOBECLsjTmwaqpv4XnHzyDPEqwh4.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-zFC4ahNU4rP0K19G8nQ8nbX-aq8tHxZ4VeXsAFYREMw.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-Pi4QVFf-_NZWwST-ZVMSljUyOFqKxnzW2x80f_0Trks.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-lrrHQh1z3LbtwoRW-b3hZfCYrJfmFjFReBtJfXGzSuM.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-GUk6GyqBrgumkxNElk-5KOEVSwNHhRc6U462jTBr6-k.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-c32GafQoUZgpcdshRXiaGeVutHSmjCfO3366pFUOEns.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-y2dPnbhySeKhgccWz8wMvUB_jgKQ74BMZrKIVr0Atg0.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-uPDnIdHwmVqLsn68eQL7oFXf50IyRdKsCJxQoSUlCIk.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-ybSoV3L77EzaWhxqJxv0GAG7db2w41otG-EQCFpCtaY.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3990530511947154830.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-2AQczOn7YKOKLi2KDcOLJJ1VVT5D7ySRc0QsO8oyrZA.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-nGd0h3xF3_uacebaPdmF6pssnBAkTUhkjpfeegSXrTk.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-w3LqaiWexanOt15txF0q0UYWyEWJb0hk2wszBkoq16A.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-MtPGET1uqnz_dxgMiShZyM83BX_tGYYPNHWp2hg0j3Q.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-abXVMQ4hhr6gGgGOUivvuzxxWYjc2uVdzF1RbdeZSFQ.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-PyLFiodERdrGvBSfitb3KEAyFCqaFeGPNdTWPek6kF4.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-CmQAYL_UVWuLlNmyWuAuvI6SPewl4P7SoecxDWcTaHM.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-X5iQ4vllzvmj9RF9dENVO4Wx-j-uo8m2-m4K5SiaDFo.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-n8heThECYUsemK6jeSC23qTbFuIg_UqpEbGf51E-osY.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-i42eBj7jQOSCEKh1jLmPfB7z9i-t_0qheZLC3mNgizI.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-1fOV7eRDqiYrj4H-c7im6SfsEJYniaCYrZaIfCWfD0I.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-Sv-J_x1AY2TwblYIOBECLsjTmwaqpv4XnHzyDPEqwh4.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-Mv2G3PcbLtnIuEY50iPe_cMXIlNBkdzfqNxg4NG08oQ.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-GzrezwNUuFLrRl2WJL46DW4IU3gX_n4aiBbDpv9Ucgs.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-H4TZ-DzwK7K6G6h7Pbbwp1LhfSWTpuOt0iOrP2cOJyo.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-ptjRo11TlnnutkepOxUHEA5aFTB7VIQncDIwnlFzWO8.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-8UG67KEn-8NgBRkWg2aDlMmxXk_IT7QxI-gvWgu8610.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-T6TZGknacREqIYqQJs5fYJI9palGZry18mHkw4MfRrA.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-COKPvFa02MDn0exrxpRtcTCr5Q3LveLMAW0jhmqNPZg.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-LSYbRjLjjQUiqx9eStE4uNlp2EBT07aSjJsfAE7R5zo.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-5iddpwLNtXJYFcx6G4HsfXnd7c1ZqKhungjbxe_JRbg.jar
    Oct 29, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT---USX6lFStygNclH0X5sJB6iNEIsKTIlQ-NKkiho_GU.jar
    Oct 29, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 29, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 29, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 29, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 29, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 29, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 29, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96042 bytes, hash 7a636fc7b0b29b263e855098036f3b009d50f5d88dfcd710546e8b753d134fc9> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-emNvx7CymyY-hVCYA287AJ1Q9diN_NcQVG6LdT0TT8k.pb
    Oct 29, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 29, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-28_17_45_42-14797180440468097126?project=apache-beam-testing
    Oct 29, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-28_17_45_42-14797180440468097126
    Oct 29, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-28_17_45_42-14797180440468097126
    Oct 29, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-29T00:45:42.125Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 29, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T00:45:51.173Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 29, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T00:45:51.961Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 29, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T00:45:51.991Z: Expanding GroupByKey operations into optimizable parts.
    Oct 29, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T00:45:52.029Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 29, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T00:45:52.088Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 29, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T00:45:52.115Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 29, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T00:45:52.169Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 29, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T00:45:52.198Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 29, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T00:45:52.577Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 29, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T00:45:52.659Z: Starting 5 workers in us-central1-f...
    Oct 29, 2020 12:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T00:46:03.962Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 29, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T00:46:19.423Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Oct 29, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T00:46:19.458Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Oct 29, 2020 12:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T00:46:24.784Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 29, 2020 12:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T00:46:46.275Z: Workers have started successfully.
    Oct 29, 2020 12:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T00:46:46.321Z: Workers have started successfully.
    Oct 29, 2020 12:47:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T00:47:21.923Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 29, 2020 12:47:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T00:47:22.132Z: Cleaning up.
    Oct 29, 2020 12:47:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T00:47:22.214Z: Stopping worker pool...
    Oct 29, 2020 12:48:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T00:48:13.391Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 29, 2020 12:48:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-29T00:48:13.433Z: Worker pool stopped.
    Oct 29, 2020 12:48:19 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-28_17_45_42-14797180440468097126 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 7994f767-e523-41f3-af86-2933a3934922 and timestamp: 2020-10-29T00:48:19.996000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.483

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 29, 2020 12:48:20 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 52.62 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 4s
107 actionable tasks: 69 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/t2xu2kcgumlq4

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1171

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1171/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Unset variables from failed assignments.

[Robert Bradshaw] Track line numbers in test parsing.

[Robert Bradshaw] Better parsing of multi-line examples.

[Robert Bradshaw] Strip ipython prompts.

[noreply] [BEAM-10893] Add Json support to Kafka Table Provider (#12839)


------------------------------------------
[...truncated 274.48 KB...]
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 28, 2020 6:45:26 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 28, 2020 6:45:26 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 28, 2020 6:45:26 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 28, 2020 6:45:27 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 28, 2020 6:45:27 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1773754872]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
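
The IllegalStateException above is the stock "no coder for a PCollection<Row>" failure: the Row output of ParDo(RowMonitor) reaches pipeline finalization without a schema or an explicit coder. Below is a minimal, self-contained sketch of the two remedies the message names (setRowSchema, or setCoder with a RowCoder), assuming the Beam Java SDK with a default runner on the classpath; the class, field, and transform names are invented for illustration and are not the integration test's code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Schema mirroring the projected columns of the query shown in the log above.
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

        Row row = Row.withSchema(schema)
            .addValues("someone", "story", "an example title", 3L)
            .build();

        PCollection<Row> rows =
            pipeline
                .apply(Create.of(row).withRowSchema(schema))
                .apply("PassThrough", ParDo.of(new DoFn<Row, Row>() {
                  @ProcessElement
                  public void process(@Element Row r, OutputReceiver<Row> out) {
                    out.output(r);
                  }
                }))
                // The ParDo output is a PCollection<Row> with no inferable coder;
                // attaching the schema here (or calling setCoder(RowCoder.of(schema)))
                // is exactly what the IllegalStateException asks for.
                .setRowSchema(schema);

        pipeline.run().waitUntilFinish();
      }
    }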

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 28, 2020 6:45:27 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 28, 2020 6:45:27 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 28, 2020 6:45:27 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 28, 2020 6:45:27 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 28, 2020 6:45:27 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 28, 2020 6:45:27 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 28, 2020 6:45:27 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 28, 2020 6:45:27 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 28, 2020 6:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 28, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 28, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-j911HP7m6VWCx6f0iwE1KmieCrdtB4wTjZMNutILAls.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-uPGVnvw5y950HCNNwDhjw3tOQe_ve9fXatgU3EQzYd8.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-uDsLJomYEaxZ7mDJl-8D-HVl7bevcaiz2o3VMIapWyE.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-Dl-L_nIMJeQvNz1AHoFDDxyxBzMJZelAeiPeiIiFXo0.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-6cKlnxMyO_z5XuPa0DuYvi_PiRb7Yxadyf43J18hUuk.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-a2w6KA8_Yd7PHQA5EBIMLPlkbwDuYhTNYuGKVio3NLE.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT--lEhs87mKquTJRk8sFjR8klpe3rAKUtUe_T_p0wEL8w.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4297775642303956663.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-KkRgM-fvIjHW8vpZjdYLl95Ewin5rbPpB2ekCZFYEVE.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-ICLau6S_P9nkKhbEq1wzVMe7e8EoALBpFsT8sqjz3_I.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-ePM3WO1FbshIl2qQmJckh9v24QWpZq18Jw5SoUng9MA.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-qaYL05HSbCBwqMjexGRpdfj8k0FJ2yLijfcjNkAiOU4.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-Lw9y_pEKupg3r4SoowIubaIxMH90kvk8GxDZJPYJRjU.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-OvgFVJY4MyUiLV8Knb9Ad85Qp5y7WzB5kS3xtfkU-5M.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-TQLe2AJ4TfFRorOnDooVEkJ83EhptmWpvzSjbwtR5-0.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-svgzoKqZkJQR5NduCub50Kwfuy48bOpkGZjW6nptG8Y.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-C_PVh9z9bgNhu3De0sNl09rK1u7HMVG_aJNdgFHrwI8.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-j911HP7m6VWCx6f0iwE1KmieCrdtB4wTjZMNutILAls.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-xLcw5ZKDwYDTTE8D7m7qzlYKU9ozzopA-FJZFRAyXC4.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-rEszAaqWhm9TZP_DYjg7aKqF9IUcMB8p_kTKO5ilNzk.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-m1obSP7JAhaFn1DtwS2s5ZfXU4oA1M-YbyfX20YtzAY.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-XLOzVOYlDWDoA9p0kz2nIaX_g9Lr5BWc9Cu0rvP2z6w.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-IGMsTT_9V5Yequ5pNMKrtfdJ8cp1kGs1fG_yph4di3o.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-Qj76krZeQAXKLhX2un11ODEp4L6Imhz6udSpm9DPdcc.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-Gyd0lIKAMSytrJw_bt3lcXWReCAEbOUUVoMfqjHQzcg.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-TKbu1S-EW7ui3EuDmBOI7yxgWhzQgan6gsmwtFYaVLY.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-r1oP-97ezp1_nF1ZKdPlQGl0vnrT2PYeJ69x9y-aQK0.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-iXU_CqwLKIVY5SDYOTSfNOaext00wEIp7DPGmEysh0M.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-K1vgAyNdwRkwwXppzoY10TKApwdhfkBYjmfZoz5OjfE.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-8j2r9InevJebN9VKn2rZAHZbl3E6K-49t7bdyzFq-5s.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-zF-H43f5XO3b262dR1Xjpj5Dm1R2jiu6-HnpcJlfQlQ.jar
    Oct 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-g57SLZ4xKS5u0Gc8m65MODpZiluRDkiz7-ZhCM-BmwI.jar
    Oct 28, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 28, 2020 6:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 28, 2020 6:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 28, 2020 6:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 28, 2020 6:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 28, 2020 6:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 28, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96044 bytes, hash d90e9ef9cf8beef065ef5eb5f547b2c930e5ae8405f1245a47c76ef0930deed8> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-2Q6e-c-L7vBl71619UeyyTDlroQF8SRaR8du8JMN7tg.pb
    Oct 28, 2020 6:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 28, 2020 6:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-28_11_45_32-16756087059821538774?project=apache-beam-testing
    Oct 28, 2020 6:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-28_11_45_32-16756087059821538774
    Oct 28, 2020 6:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-28_11_45_32-16756087059821538774
    Oct 28, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-28T18:45:32.826Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 28, 2020 6:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T18:45:47.370Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 28, 2020 6:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T18:45:48.011Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 28, 2020 6:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T18:45:48.117Z: Expanding GroupByKey operations into optimizable parts.
    Oct 28, 2020 6:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T18:45:48.176Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 28, 2020 6:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T18:45:48.296Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 28, 2020 6:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T18:45:48.342Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 28, 2020 6:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T18:45:48.383Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 28, 2020 6:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T18:45:48.439Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 28, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T18:45:49.214Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 28, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T18:45:49.323Z: Starting 5 workers in us-central1-c...
    Oct 28, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T18:46:21.892Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 28, 2020 6:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T18:46:27.785Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Oct 28, 2020 6:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T18:46:27.813Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Oct 28, 2020 6:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T18:46:33.125Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Oct 28, 2020 6:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T18:46:33.191Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Oct 28, 2020 6:46:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T18:46:38.519Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 28, 2020 6:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T18:46:49.316Z: Workers have started successfully.
    Oct 28, 2020 6:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T18:46:49.369Z: Workers have started successfully.
    Oct 28, 2020 6:47:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T18:47:23.030Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 28, 2020 6:47:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T18:47:23.263Z: Cleaning up.
    Oct 28, 2020 6:47:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T18:47:23.469Z: Stopping worker pool...
    Oct 28, 2020 6:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T18:48:03.822Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 28, 2020 6:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T18:48:03.942Z: Worker pool stopped.
    Oct 28, 2020 6:48:12 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-28_11_45_32-16756087059821538774 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 546df6e9-2f21-41d3-9417-b3efa490fd27 and timestamp: 2020-10-28T18:48:12.372000000Z:
                     Metric:                    Value:
                   read_time                    14.958
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 28, 2020 6:48:12 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
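
For reference, the push-down run above executes the query shown in the log (the by/type/title/score projection plus the filter handed to BigQuery). Below is a minimal, self-contained sketch of running the same query text through Beam SQL's SqlTransform over an in-memory PCollection<Row>, assuming beam-sdks-java-extensions-sql and a runner on the classpath; the class name and sample rows are invented for illustration and this is not the integration test's code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionTuple;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TupleTag;

    public class PushDownQuerySketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Columns used by the query; `by` is kept and backtick-quoted in the SQL,
        // matching the statement in the log.
        Schema schema = Schema.builder()
            .addStringField("by")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

        PCollection<Row> hackerNews = pipeline.apply(
            Create.of(
                    Row.withSchema(schema).addValues("alice", "story", "kept by the filter", 5L).build(),
                    Row.withSchema(schema).addValues("bob", "comment", "dropped by the filter", 9L).build())
                .withRowSchema(schema));

        // The tuple tag supplies the table name referenced in the query.
        PCollection<Row> filtered =
            PCollectionTuple.of(new TupleTag<>("HACKER_NEWS"), hackerNews)
                .apply(SqlTransform.query(
                    "SELECT `by` AS `author`, `type`, `title`, `score` "
                        + "FROM `HACKER_NEWS` "
                        + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        pipeline.run().waitUntilFinish();
      }
    }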

Gradle Test Executor 4 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.006 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.006 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 53.998 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 55s
107 actionable tasks: 70 executed, 37 from cache

Publishing build scan...
https://gradle.com/s/jpydv6lv7sp4i

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1170

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1170/display/redirect>

Changes:


------------------------------------------
[...truncated 273.12 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 28, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 28, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 28, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 28, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 28, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 28, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 28, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@159099682]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 28, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 28, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 28, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 28, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 28, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 28, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 28, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 28, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 28, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-IWvoOVgyYr6ledh43klzJAIh16mmxQcK2EJYFXyt2hg.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-b63wQPmaIvXojLZXDMbeYuozYbGG9oAIHF6TeVQkHgY.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-6vyBKHFTKhxmW7OkcmI4Da9QNYAC7oZeyQY7UiZDwa8.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-UMnxjnCRV-oqTiXOGePnDlYWfhYsAg9vn1jp5MaO30A.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-Njjiy_pgpOMPsZ0WA6psWBGj2h6h8U6-kGHFEw2yke0.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-glp6VfjXCThDjQb6UXEwCiuzGrSkrWXARExebLDwe7s.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-VuagPmrd1VXLOLQn17p58zdEB00BUMleuqxkwL0ymfw.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6337205181281604157.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-9TYfEEhD-NpY7urzsbJJpn5ZMm_ov1p7lLXTq9k_sY4.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-R139B50uiWTg9zq90zm9bsVambAaQZt-PPVv-OYpKf0.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-VxPvc7SM81d5-92vneGvAaAieLZNh9-hK4wkztjBdIg.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-e_OQuNdhx4WTmQLp3Pj-hqaSZFi5TFQfwxhXeYmRKcw.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-KuU6m90TxwAWUyuhn3Vl_YBIJoOk4vT8YvEqAIUGZbM.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-Moh-UQUHfXmIToROU_ODQZKUYdE3qtqXJ1BajNBcZDw.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-XXW5twmdjx8GjRgmSs7vG4jYoxVGiYJJTDbL76cVgU0.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tVNl5WsFnuXEgH1vrF-PNeGueK5xt6S_z7YNlg5DRbs.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-oiPpfKkV04kmT9n6kosdJnNWhysjR3KGkNOWs3NH3xo.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-PmjNKn5InLKMGX4kIJp3qnskDPiJ8y_D2AOrfGNAa80.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-QYxi5lJc9sDN-APWXAM5j_PfuwDYCCMbrfZPdzuff-8.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-VamGq7BUKufX2LrlqqnxQcbQsw7GxtraLG2hJeA3Cdc.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-aAiRQyFnAwqkNl7yXv28ZTZdvD-K32Iu9Y7uwJlAkHs.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-d_yREhvM9PEswoGmYThOtVMUBg84Ll1gwZin_LeWaw0.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-hhys_-2JFcuZaDP2PH08hxVIVPvDVuV5Un6tCho-nnM.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-hTL_PiaTkNDfdPOvXsyQL9gXGMGrkklupnZff7mmwVs.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-cHoTALxY4lqdtElS1pln-zsQreLdTn7k1FjsLrtimr4.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-R139B50uiWTg9zq90zm9bsVambAaQZt-PPVv-OYpKf0.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-DS8KrvSugKugnqpWMrsDieVsAgWG0oyMYmfbHJcjwYk.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT--e6e5OMInvq6Dxe1JGUQUwdGbWUFhiPT93FqXomoe4I.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-nY103Ej7oQZsJ6SrUo6tzh0xTUGSIB2T3PXMrQ3dliE.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-PXm_1JabZxc3RJsN-BC-3PNcN4rLy6rdJbb8gOZR3NY.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-YOKgtYmRKyNdv94kAHw35rvViCgubkJblFDYgl5WiPU.jar
    Oct 28, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-e_bTGPPrfOdJusnqttHj8ZEzLhRsCRBVwrv-3Zsj8PU.jar
    Oct 28, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 28, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 28, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 28, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 28, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 28, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 28, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96044 bytes, hash 8e66321f196c521770c4412d29daa4094c3c2e40c4fa0102d6a6c55dc58d39d3> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-jmYyHxlsUhdwxEEtKdqkCUw8LkDE-gEC1qbFXcWNOdM.pb
    Oct 28, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 28, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-28_05_45_22-5851414869654235868?project=apache-beam-testing
    Oct 28, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-28_05_45_22-5851414869654235868
    Oct 28, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-28_05_45_22-5851414869654235868
    Oct 28, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-28T12:45:22.389Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 28, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T12:45:29.199Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 28, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T12:45:29.821Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 28, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T12:45:29.869Z: Expanding GroupByKey operations into optimizable parts.
    Oct 28, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T12:45:29.921Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 28, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T12:45:30.011Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 28, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T12:45:30.036Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 28, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T12:45:30.063Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 28, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T12:45:30.097Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 28, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T12:45:30.601Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 28, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T12:45:30.686Z: Starting 5 workers in us-central1-f...
    Oct 28, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T12:45:49.683Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 28, 2020 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T12:45:58.320Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Oct 28, 2020 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T12:45:58.348Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Oct 28, 2020 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T12:46:03.611Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 28, 2020 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T12:46:26.298Z: Workers have started successfully.
    Oct 28, 2020 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T12:46:26.344Z: Workers have started successfully.
    Oct 28, 2020 12:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T12:46:59.206Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 28, 2020 12:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T12:46:59.435Z: Cleaning up.
    Oct 28, 2020 12:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T12:46:59.521Z: Stopping worker pool...
    Oct 28, 2020 12:47:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T12:47:48.712Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 28, 2020 12:47:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T12:47:48.759Z: Worker pool stopped.
    Oct 28, 2020 12:47:56 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-28_05_45_22-5851414869654235868 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ca8a6dad-df55-4e00-ab7d-0203e08092ca and timestamp: 2020-10-28T12:47:56.709000000Z:
                     Metric:                    Value:
                   read_time                     14.32
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 28, 2020 12:47:57 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 48.954 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
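
For reference, the Gradle hint above amounts to re-running the failing task with more diagnostics. A local invocation along these lines would do it (illustrative only; the Jenkins job supplies its own properties and pipeline options on top of this):

    ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --stacktrace --info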

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 40s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/apghwtvv2gj6g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1169

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1169/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Allow conversion from PCollection of primitives to series.

[Robert Bradshaw] Allow conversion of scalars to singleton dataframe.

[Robert Bradshaw] Default generation of proxy for BatchRowsAsDataFrame.


------------------------------------------
[...truncated 273.68 KB...]
    Oct 28, 2020 6:45:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 28, 2020 6:45:18 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 28, 2020 6:45:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 28, 2020 6:45:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 28, 2020 6:45:18 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 28, 2020 6:45:18 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1773754872]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
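
The IllegalStateException above is a coder-inference failure, and its message names the usual remedy: a PCollection of Beam Row values needs an explicit schema attached so that a RowCoder can be inferred. A minimal sketch of that remedy follows; the field names mirror the projected columns in the query, but the field types, the helper name, and where exactly the schema would be attached are assumptions for illustration, not the actual BigQueryIOPushDownIT code.

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    /** Hypothetical helper: attach a schema so Beam can infer a RowCoder. */
    static PCollection<Row> withRowSchema(PCollection<Row> rows) {
      // Field names follow the projected columns (author, type, title, score);
      // the field types are assumed for this sketch.
      Schema schema =
          Schema.builder()
              .addStringField("author")
              .addStringField("type")
              .addStringField("title")
              .addInt32Field("score")
              .build();
      // This is what the error text asks for: a schema on the Row-typed
      // PCollection, instead of a manually specified Coder.
      return rows.setRowSchema(schema);
    }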

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 28, 2020 6:45:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 28, 2020 6:45:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 28, 2020 6:45:18 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 28, 2020 6:45:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 28, 2020 6:45:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 28, 2020 6:45:18 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 28, 2020 6:45:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 28, 2020 6:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
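
For context: combined with usedFields=[by, type, title, score] in the BEAMPlan above, the pushed-down filter means the BigQuery storage read returns only the four projected columns for rows that match the predicate, roughly what this query would return against the underlying table (the table reference below is a placeholder, not the dataset the test actually reads):

    SELECT `by`, `type`, `title`, `score`
    FROM `<project>.<dataset>.HACKER_NEWS`
    WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2
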
    Oct 28, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-PyW1i8pjk0ZDagqFz_ENB3S95VxL6Fd7ouGfnJuT95I.jar
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-LlKflgmZn1rEU2350sI4GMV9ZntPfx6PhBRD_j6oCss.jar
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-9xirNQ8IcW7JBIaN2WRq405wanrojfZ6JBP5o8pgmA4.jar
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-9NU5TFwQEVtMNy_o9Xa2-5xsQhfgD5nAmDlPf3Embx8.jar
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-lOAItF1YrMY8Vj_dbGAodjnf_uWCtQ8ytlZxTiMG92o.jar
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-MGNA3PQ-s2k1pUlUVnDaqa8m4eL-b-hdPAc7TcZB9Xk.jar
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-1B7yow4w0_FHYYseHDmDm6DrmmK7NCz0jVeecjFmZE8.jar
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-xh69G0CTQi9jJTFshIA6SftQeZVTVEea0MQRFbcKHHo.jar
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-Kz4kOY0xS1RiQ2fXJFA9pfYjBk6GqaAu8kav_MoFKfY.jar
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3558154494019224424.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-QgPCwDOYWIXzteyLA9dGAaUX90ZpVcLcpPLG-adtDPA.jar
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-dERsOVPr9YhJ3PcDZg4ipvvLQIfo_qucpUVHUidKU78.jar
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-swOZkqKdzTw_SrgnPf77xi3fxFWJ9_R5nggrIpbbyq0.jar
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-shlYj6r2gZ0E2RzsAYLcPyUuLuFtCE99_D4OCUzngbI.jar
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-9vjV6H-IshtdGvNmsxXQZi6ixr1YZa7vCRQ3dXwB-Wk.jar
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-tzL5ygQOcwEbTCS1doDwZLd3cCt45dRorIbjbD64Y3o.jar
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-TMNLn7VrbVjpSr_GRqVEgYReS7GgIz6ZSpoH-7KnMRA.jar
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-9xirNQ8IcW7JBIaN2WRq405wanrojfZ6JBP5o8pgmA4.jar
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-OaJGXPoqNmraeIhHuQnZOf_KQeimw1e7MkKUNrlmn9A.jar
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-LfhQeHwSY_UAlnPHe2ey2ecHMKHpg9K5r05rEOFV3tY.jar
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-FKVZ0qslteEHdwBxmlgnDlsligPiVjPc0Zdv3F_590M.jar
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-b_4BCLgWweuecyyRWGA4Z7jfstwbZYPM4CcO0mw0wgA.jar
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-_JoyvJbrzdHOq4M0yagHVoHmMsYtd8Uc-uzDH9qq8bo.jar
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-3X6bS6jDjte-HzZmtbOKWSQau4HqlASEe19IP_vhAJY.jar
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-p9N-PCcO_jlMAhsCHbEMBBswxMslqCXdfd1FMJNUdGY.jar
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-Y7CJiB8kHgvuJDCArNIfsh_3_IiQp6Hf6rsai83WVJM.jar
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-E47xsnDLQZHENBoK7sbKse3s6HvX5J2Bs2UrOjELeEc.jar
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-fIpBdtdTKclNXU1nur322iYDK7SrV7kx9EXFTPuu6ZA.jar
    Oct 28, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests--5hCdEU6Mo2z0CfJgxvhGXpYcSqbKzxK4evR4TpNBpU.jar
    Oct 28, 2020 6:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-mXAZk8aZXG550iTvzJkMQRzMiHQK8LXmHWwtvKH9pYw.jar
    Oct 28, 2020 6:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-g1KppYwuEaicppz-vJrMmVO1wgmN-tqpblYuA6S2ZyQ.jar
    Oct 28, 2020 6:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-WjY76Y09WlR5V5aS_PO2xmz9jZPWaExJf5APtkRpll8.jar
    Oct 28, 2020 6:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 28, 2020 6:45:23 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 28, 2020 6:45:23 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 28, 2020 6:45:23 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 28, 2020 6:45:23 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 28, 2020 6:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 28, 2020 6:45:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96044 bytes, hash 678b0632eacb254f5dd1f1724268e18de55be51a274cb56927a164a1ae4b3f99> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Z4sGMurLJU9d0fFyQmjhjeVb5RonTLVpJ6Fkoa5LP5k.pb
    Oct 28, 2020 6:45:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 28, 2020 6:45:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-27_23_45_24-14662296428337735994?project=apache-beam-testing
    Oct 28, 2020 6:45:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-27_23_45_24-14662296428337735994
    Oct 28, 2020 6:45:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-27_23_45_24-14662296428337735994
    Oct 28, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-28T06:45:24.212Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 28, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T06:45:32.184Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 28, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T06:45:32.830Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 28, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T06:45:32.868Z: Expanding GroupByKey operations into optimizable parts.
    Oct 28, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T06:45:32.898Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 28, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T06:45:32.972Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 28, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T06:45:33.000Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 28, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T06:45:33.102Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 28, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T06:45:33.135Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 28, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T06:45:33.497Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 28, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T06:45:33.565Z: Starting 5 workers in us-central1-f...
    Oct 28, 2020 6:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T06:45:56.157Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 28, 2020 6:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T06:45:58.052Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Oct 28, 2020 6:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T06:45:58.077Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Oct 28, 2020 6:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T06:46:03.295Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Oct 28, 2020 6:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T06:46:03.322Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Oct 28, 2020 6:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T06:46:19.009Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 28, 2020 6:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T06:46:20.521Z: Workers have started successfully.
    Oct 28, 2020 6:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T06:46:20.555Z: Workers have started successfully.
    Oct 28, 2020 6:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T06:46:56.814Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 28, 2020 6:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T06:46:56.977Z: Cleaning up.
    Oct 28, 2020 6:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T06:46:57.065Z: Stopping worker pool...
    Oct 28, 2020 6:48:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T06:48:03.640Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 28, 2020 6:48:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T06:48:03.678Z: Worker pool stopped.
    Oct 28, 2020 6:48:08 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-27_23_45_24-14662296428337735994 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 348df08a-3c13-4592-b46f-be456578b4e8 and timestamp: 2020-10-28T06:48:08.877000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.648

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 28, 2020 6:48:09 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 59.523 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 53s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/lmmchphuivlxw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1168

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1168/display/redirect?page=changes>

Changes:

[Kenneth Knowles] Add checkState to catch graph corruption during replacement

[Kenneth Knowles] Fix DataflowRunner batch view overrides; give them accurate types

[Kenneth Knowles] Make Java SDK container pull licenses without gradle subprocess

[Kenneth Knowles] Do not read dependency report from cache

[Andrew Pilloud] [BEAM-11145] Upgrade to ZetaSQL 2020.10.1

[noreply] Add Select to schema docs (#13046)


------------------------------------------
[...truncated 274.48 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 28, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 28, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 28, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 28, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 28, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 28, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 28, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1214853303]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:544)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:496)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 28, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 28, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 28, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 28, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 28, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 28, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 28, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 28, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 28, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-ZvDSXjntweP7EbxHanNBoQjU7SQDQpHi-8HbuMwOlJo.jar
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4872893827818795035.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-NVZwmOzRlLKtlWFD_2obMzW4t6ra8J7mWn4qKO6Ovxg.jar
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-ElIYYnoma-Mwge0yXQC4kf7gD8rc8JuyNXNVf3JIovQ.jar
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-2Exc9aRlb2IHDm_wwXEkrvMcQfPNeYVwAetnx4btl3Y.jar
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-KNwIXQL1OgMw_nwXA-EG19GlE3jVCyDSvn2tSuM0c9g.jar
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-rSuPndDLcxDCkmfnX4ZxdK4EDn-Wo_oMLM9jix12SUo.jar
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-CL848vJU1_Lb_o3lBdKoupa_0cW-yIe9Fm4z6_DC4CQ.jar
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-4opMbGkesKXWtwVBzcyGjM0HuRm0RZh3oYZgGH2oCJY.jar
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-TZdoyCX7Fcoofp-uwYLFRKUs6CygRPSdYPqKBUWkKSw.jar
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-Gx3ue0qy6FuDQty4P8aHuwkO3PRU1KHhfsW-46bE0kQ.jar
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-lhf1PYo8S1bawGqdQyRtN22y847ouaG6NsvhF5c6t8s.jar
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-uB84WcHahDaK5EKEZb1kK000a2FaTaODlICkLSqVFdo.jar
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-d_rddlLDzKbOjnSeUM8neJoWyfwOUC1bPcIF4MwBm08.jar
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-pT43wjW7k5s7MC6VMzc9T2aCFNo6zrMO_n5MR9M6XvI.jar
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-Td-GPSGQ7YpFQDMIl2ddkq_TIrlwJEy4WJjnULC0VDg.jar
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-c-c-pL-NvrM1tb9pIixCB2v_dnQ2SJW7tBB4YxhoG-4.jar
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-ZvDSXjntweP7EbxHanNBoQjU7SQDQpHi-8HbuMwOlJo.jar
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-LiHsG89MusfgtklOnbBNiA50w11SCP8SRvtPh3T1LEg.jar
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-sn6du0-o_qv8kbmVgdZFpRseq85hkKjEJKMMH40az0M.jar
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-gbcfIJA3-8eA54dA8ZZYoBqKwtbwsjW5BgMq_UK2-Ko.jar
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-TRykxMt05FlVeh1F_R-2HHKkuLpygGOGPFzp0MCfdeE.jar
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-XREKzZm7tQP0fexcWdpcntVF_67YD27LNKaHvFUG-Gk.jar
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-6NLfdTrc6qlvprgvo7uZZa9PLr_Qp3sBWiWhqDauEDM.jar
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-QmEYlmkd3GEefRXsB6aBftUJ5gaanGT656ki7oIu50A.jar
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-0QmO_rheXHmayz6paa7Gus45Ti0r-xiht5C2SRUFvpM.jar
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-BbB2j7z9jwugnk3dBeIiVZPSwdCPe_8KJsyWCUuiIsQ.jar
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-89MWF2vlc1sXYVnRgqVzs3k3BBKzKcvTS5BExignm_o.jar
    Oct 28, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-ZMTA0uFckkG4v59nH91DoWMui9cN-0jgT-bFu9qOoC0.jar
    Oct 28, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-h9xZnVIFmOP1htNuTtPmwETIDOQ6B-iTDbHHPc30WCU.jar
    Oct 28, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-wGlzhPdFcvbC5scIoQITQx_aH191UC2VaUylj-nFeBM.jar
    Oct 28, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-BQiWZQryDcayeauvHeI0Oau_OoGzYkflJ3K_tCJ-31U.jar
    Oct 28, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 28, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 28, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 28, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 28, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 28, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 28, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96042 bytes, hash cc9f7d7d2998c418489d2f697610637df49901aa07b86d5547f2e7f9d18b618a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-zJ99fSmYxBhInS9pdhBjffSZAaoHuG1VR_Ln-dGLYYo.pb
    Oct 28, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 28, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-27_17_45_41-3563317175113850044?project=apache-beam-testing
    Oct 28, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-27_17_45_41-3563317175113850044
    Oct 28, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-27_17_45_41-3563317175113850044
    Oct 28, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-28T00:45:41.788Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 28, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T00:45:49.414Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 28, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T00:45:50.184Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 28, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T00:45:50.235Z: Expanding GroupByKey operations into optimizable parts.
    Oct 28, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T00:45:50.267Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 28, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T00:45:50.356Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 28, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T00:45:50.400Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 28, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T00:45:50.436Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 28, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T00:45:50.475Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 28, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T00:45:50.924Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 28, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T00:45:51.024Z: Starting 5 workers in us-central1-f...
    Oct 28, 2020 12:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T00:46:18.492Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 28, 2020 12:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T00:46:18.539Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Oct 28, 2020 12:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T00:46:18.584Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Oct 28, 2020 12:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T00:46:23.830Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 28, 2020 12:46:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T00:46:41.457Z: Workers have started successfully.
    Oct 28, 2020 12:46:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T00:46:41.514Z: Workers have started successfully.
    Oct 28, 2020 12:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T00:47:15.837Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 28, 2020 12:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T00:47:16.074Z: Cleaning up.
    Oct 28, 2020 12:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T00:47:16.175Z: Stopping worker pool...
    Oct 28, 2020 12:48:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T00:48:12.407Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 28, 2020 12:48:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-28T00:48:12.464Z: Worker pool stopped.
    Oct 28, 2020 12:48:17 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-27_17_45_41-3563317175113850044 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 5e8e6dcc-2808-47c4-b5c8-0e2dab1938a5 and timestamp: 2020-10-28T00:48:17.726000000Z:
                     Metric:                    Value:
                   read_time                    14.488
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 28, 2020 12:48:18 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.117 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.094 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 51.936 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 3s
107 actionable tasks: 69 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/grlv2lyn5vdle

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1167

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1167/display/redirect?page=changes>

Changes:

[Kyle Weaver] Update release guide instructions.

[Robin Qiu] Update SupportedZetaSqlBuiltinFunctions and support math functions

[noreply] Merge pull request #13180 from [BEAM-9094] Configure S3 client for IO to


------------------------------------------
[...truncated 275.28 KB...]
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 27, 2020 6:45:52 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 27, 2020 6:45:52 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 27, 2020 6:45:52 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 27, 2020 6:45:52 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 27, 2020 6:45:52 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 27, 2020 6:45:52 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 27, 2020 6:45:52 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1214853303]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:494)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
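
The IllegalStateException above is the SDK refusing to pick a default Coder for a PCollection of Beam Rows that has no schema attached. Below is a minimal, hypothetical sketch of the kind of change the error message points at (setRowSchema, or setCoder with a RowCoder); the field names and types are assumptions read off the SELECT list above, not taken from BigQueryIOPushDownIT itself:

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Hypothetical helper: attaching a schema lets the SDK infer a RowCoder
    // instead of failing with "Unable to return a default Coder".
    static PCollection<Row> withAssumedHackerNewsSchema(PCollection<Row> rows) {
      Schema schema =
          Schema.builder()
              .addNullableField("author", Schema.FieldType.STRING)
              .addStringField("type")
              .addStringField("title")
              .addInt64Field("score")
              .build();
      // Equivalent alternative: rows.setCoder(RowCoder.of(schema));
      return rows.setRowSchema(schema);
    }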

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 27, 2020 6:45:53 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 27, 2020 6:45:53 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 27, 2020 6:45:53 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 27, 2020 6:45:53 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 27, 2020 6:45:53 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 27, 2020 6:45:53 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 27, 2020 6:45:53 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 27, 2020 6:45:53 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
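
For context, the projection (usedFields) and filter that the planner reports pushing down here correspond roughly to a hand-written BigQuery Storage API read. The sketch below only illustrates that equivalence and is not code from the test; the project/dataset/table reference is a placeholder:

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    public class PushDownReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        // Read only the used fields and let BigQuery apply the filter server-side,
        // which is what the pushed-down plan above asks the storage read to do.
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("some-project:some_dataset.hacker_news") // placeholder table
                    .withMethod(Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction(
                        "(type = 'story' OR type = 'job') AND score > 2"));
        // Running this requires GCP credentials and a real table; omitted here.
      }
    }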
    Oct 27, 2020 6:45:55 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-j0SQ6nbEuaybFc90Ik_881KRmfArGypolYIPRZmFZUc.jar
    Oct 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5653086721064675270.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-SFbzOrnuYznK0UD_YJEfmWvIsBXGNRn_v_zeQS6JWJg.jar
    Oct 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-itMSlicZVENjV8urwU0wGRSiifbxWEakGoVn0rzIaGU.jar
    Oct 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-PLtYd24EB-i3f0i1daKr9ARQ--SrgOZdHArhfWvHcQ4.jar
    Oct 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-3IdJXcaISzgy0Ap9DOlMzO6EzY5-ihcoFcBtigA2j98.jar
    Oct 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-Q_1YLJu0MmG2MW2IUsbvrvUnBOLOmbbzvZIhrAj8h4c.jar
    Oct 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-KENo1GgEV_1i7bymVkUP2NF2tyCuxwCwIWOt46MXyc8.jar
    Oct 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-KMEFhaoHGRWJhiKen1g8P8pb5NAs-xy8vN-myhbMAO0.jar
    Oct 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-G-MZQKgNnNkrtDnnXtcNCypDo8cs7OhkwKg0MPvPMpI.jar
    Oct 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-e1xKmzWhuxNbZJaNACu-K7gNe2J4FmpOpCqJVUlhmF0.jar
    Oct 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-Ohv17NdRfz_2X1pGvAJXttRFBnfXE6RIHGCIygpFwTo.jar
    Oct 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-F4njFgBtGekS_7rXMLH0I9j8-TGGv1qr7Tjtnrpvlyk.jar
    Oct 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-KzA3I4N-3yhS4Gdio-A9HoZSiCgHtLzV7l2M5BfR3s0.jar
    Oct 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-LkO5zDhj4fxrI5WOKXHVNdySsGetU3ltlHtQTn_BAoU.jar
    Oct 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-hRYAYt8Ae7Oy-CFACvPDeV_qkKawvEc-9B9ky3q_bD4.jar
    Oct 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-5J2omCs71DT0L96RxN9RXwt0XszKhgeJyQHyMbicD4c.jar
    Oct 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-yD9ptjkiKCLdVFADY9DzbmiYQv-npre-N3SipDF4l-0.jar
    Oct 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-PMnPymXpJ0UmvipILhGg5P_ldHzuSlkL2FUgoW377Qw.jar
    Oct 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-jJnOPUSqP3GbbSJ93YgKjP0_b27IsndqdMQPX8tkKZM.jar
    Oct 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-PgQ9N0lu0sAITTO7dJ4wYhFCInPoH86DNig5aS01Lfg.jar
    Oct 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-ISJsBWrAKsPfZZRkm0LNxtWZF70Pics4q1cJQO3cj8E.jar
    Oct 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-oIXxgVJipGOZwkR2JQ1m2Adr6Rgb-Q68eDzBAgSlXFg.jar
    Oct 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-33E_u1aVeCNb2Yfc4rGbJ2ijusBBir5uns_vmUQIau4.jar
    Oct 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-kPoRDV5mZEZaxFx5b55CtdtRZFpEayhpndZOQhUkaQA.jar
    Oct 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-N9xKag3LUzibUILdaNLM_yumy0AaqlE1s1MdKS_cSTo.jar
    Oct 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-WhdLMhxmHPQQKSDH-7cmdWkVFRDmPOzpDPKGG_y0aqg.jar
    Oct 27, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-2Ljsh_sZ-cromD26dDnRj91QTPvag4Hx9ILj6oZSSXc.jar
    Oct 27, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-5J2omCs71DT0L96RxN9RXwt0XszKhgeJyQHyMbicD4c.jar
    Oct 27, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-3sSHKpSCsO9AVE8JY2Ty05eu7C7Z4RmCKqX1iEfRhYU.jar
    Oct 27, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-WE1wue-s068IiyIIQRb3DdG6RzCfK2vc5k8T-mUtVsA.jar
    Oct 27, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-9c8tBqz0oC2uzhDHN2XJtovL7TAFM2yjKCMp8Q-QnI4.jar
    Oct 27, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 27, 2020 6:45:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 27, 2020 6:45:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 27, 2020 6:45:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 27, 2020 6:45:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 27, 2020 6:45:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 27, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96042 bytes, hash e052904f25730df2b20b584185f00d3091ad2523571432b6114d5ec55a8dbaf0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-4FKQTyVzDfKyC1hBhfANMJGtJSNXFDK2EU1exVqNuvA.pb
    Oct 27, 2020 6:46:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 27, 2020 6:46:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-27_11_46_00-9185574335347165709?project=apache-beam-testing
    Oct 27, 2020 6:46:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-27_11_46_00-9185574335347165709
    Oct 27, 2020 6:46:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-27_11_46_00-9185574335347165709
    Oct 27, 2020 6:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-27T18:46:00.141Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 27, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T18:46:07.914Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 27, 2020 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T18:46:08.766Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 27, 2020 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T18:46:08.890Z: Expanding GroupByKey operations into optimizable parts.
    Oct 27, 2020 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T18:46:08.949Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 27, 2020 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T18:46:09.043Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 27, 2020 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T18:46:09.082Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 27, 2020 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T18:46:09.117Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 27, 2020 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T18:46:09.149Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 27, 2020 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T18:46:09.623Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 27, 2020 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T18:46:09.703Z: Starting 5 workers in us-central1-c...
    Oct 27, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T18:46:26.845Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 27, 2020 6:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T18:46:35.342Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 27, 2020 6:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T18:46:55.679Z: Workers have started successfully.
    Oct 27, 2020 6:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T18:46:55.738Z: Workers have started successfully.
    Oct 27, 2020 6:47:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T18:47:36.931Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 27, 2020 6:47:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T18:47:37.119Z: Cleaning up.
    Oct 27, 2020 6:47:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T18:47:37.226Z: Stopping worker pool...
    Oct 27, 2020 6:48:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T18:48:30.397Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 27, 2020 6:48:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T18:48:30.445Z: Worker pool stopped.
    Oct 27, 2020 6:48:38 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-27_11_46_00-9185574335347165709 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d7256843-cd15-40a4-8b65-7a091b7a751c and timestamp: 2020-10-27T18:48:38.122000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    20.918

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 27, 2020 6:48:38 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.035 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 54.189 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 21s
107 actionable tasks: 70 executed, 37 from cache

Publishing build scan...
https://gradle.com/s/3a3wjhndi7iqq

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1166

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1166/display/redirect>

Changes:


------------------------------------------
[...truncated 272.11 KB...]
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 27, 2020 12:45:33 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 27, 2020 12:45:33 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 27, 2020 12:45:33 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 27, 2020 12:45:33 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 27, 2020 12:45:33 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 27, 2020 12:45:33 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 27, 2020 12:45:33 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1214853303]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:494)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 27, 2020 12:45:33 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 27, 2020 12:45:33 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 27, 2020 12:45:33 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 27, 2020 12:45:33 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 27, 2020 12:45:33 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 27, 2020 12:45:33 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 27, 2020 12:45:33 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 27, 2020 12:45:34 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 27, 2020 12:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-GWr_xrE4OsU_yKMoE49qQSYTS8eule9JhrGw_Kt_wcc.jar
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-IUPxfN9TwYuHThsfwzVPruiBWl5FaBoNksHvBO6qokA.jar
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-OlxucYCgRi9vDhLrf9tDfls_a0AMJqFWN0crdN_1iEg.jar
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-EOm7R-m83H3g6BQJoZ6_Y6tJbxHSqViapgEncJTs630.jar
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-jxDwvlRzn6Fp905ARUcCsTRlRV6OR4XANyC6vI7VkK4.jar
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-EoFuiiVXjQ6_3x8pLdIeuF0KYMydGRd6tKj90_T2-Mw.jar
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-E2EZzQnXN24NrMIrO-7p-hOnpzO8frrZjrtKkjXrMsA.jar
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-bA3b5RVI8hZMKEwSA7gIPstU_-k4UTmMcvai01wckI4.jar
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-djlDc9vFKE7pCskGuwDzyCI2aZd_Hdnl5npCJekim_Q.jar
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-yYQC0QTRo1StA363Dx9lSA3n-5mCQ2sUO59mAAdYXFM.jar
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-vql4glRBa34gSUB1NXi0RzvEXtF_lM-BFHE9Jbc3hLU.jar
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-KUSelyJMWlj2mapI2B-QHmu3wQsdLq0O6rB2cGLyeFc.jar
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-UYBbEsU8jn7HouXN_QOEFfR5wVwMneb-HlpzGvMemIs.jar
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-2hxKmH_DsuH0JjQIDookeKNcRKQJHtwb08iN4C1o90k.jar
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-4uXDJnoRv-0TXDenx4Z7ak1GAShSsUg8PM39p1Yjl6g.jar
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1821736526016424028.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-_4_6MJ3vQJ77BrWouQqweLWGYG1yvo7yF4DX8Z4JfXM.jar
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-o4NZ2XdA8Q5Yd5blX4PDX3Q6RVTvUZ2UgspUi19Tw04.jar
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-Qq3Gb9YSQ9mUfYmnlmVYrRefYXt6Vzg9Tb30PUrrS4M.jar
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-bWvygD_4-LJSENubyEJQiyqGjmFdkn5lzxRx-DtEdkY.jar
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-z3z6VdH21Tz9udkXXStB-XCtJfR_BYMy8JtwQcOi3oI.jar
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-JfP2B6tT3nO-iqW4KzsMSy6YI37CMlEmRsj8nbPEUl4.jar
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-1HVDaZ4v4Xziq3KuuEi6v0t9HzvToy5rN3ylI1DQWBI.jar
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-nj_Up3llG3I_eNG__8vU7N4S-hg0Zo2wyZF8Wru3PLM.jar
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-VjkErMQ_jJWrIOaust7zPqz8G2_w-5h1EMbBRALlbcU.jar
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-hnNdCHTc3SwoDTTJoVVJWvHzbxBJh2QLZYPFrEPgiRg.jar
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-vnTeyMKXWCm_Xiv_WNx30TpL9Z7x3Z5RdWGjEGVq3Uo.jar
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-6bWT3q1c9T0YPiQvOn8Lp0mbnXoWHCtMkD2Eyb2H-rM.jar
    Oct 27, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-djlDc9vFKE7pCskGuwDzyCI2aZd_Hdnl5npCJekim_Q.jar
    Oct 27, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-66h8OGcAN5FGnmVQmh_V41-J5BSfehPFPxjYASHgGI8.jar
    Oct 27, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-TEnUho19lpfOIelRx9JoW1blZIaEH7cet8N0qqK-1mU.jar
    Oct 27, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-ZczAEXii3NMzT11tHkLq12eVmALQqpJQVbFzJO6C7Kw.jar
    Oct 27, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.mongodb/mongo-java-driver/3.12.7/1f45c6a397feeb46da75425619333d1cc6f90f78/mongo-java-driver-3.12.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/mongo-java-driver-3.12.7-D_zgBJWNb9mzmuetJ37a0X9XtpcfSGsXYpxe6eE8Tao.jar
    Oct 27, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 188 files cached, 31 files newly uploaded in 1 seconds
    Oct 27, 2020 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 27, 2020 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 27, 2020 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 27, 2020 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 27, 2020 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 27, 2020 12:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96042 bytes, hash f328e1eb0a50e8bb409c6957e7346248e84e50a0980ed7707dd73ffa16809598> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-8yjh6wpQ6LtAnGlX5zRiSOhOUKCYDtdwfdc_-haAlZg.pb
    Oct 27, 2020 12:45:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 27, 2020 12:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-27_05_45_40-14895917271299975503?project=apache-beam-testing
    Oct 27, 2020 12:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-27_05_45_40-14895917271299975503
    Oct 27, 2020 12:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-27_05_45_40-14895917271299975503
    Oct 27, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-27T12:45:40.400Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
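The warning above is expected when autoscaling is disabled: with autoscalingAlgorithm=NONE the worker pool stays at numWorkers, so maxNumWorkers has no effect. A minimal sketch of the relevant Dataflow pipeline options (the exact arguments this Jenkins job passes are not shown in the log, so these flags are illustrative):

    --runner=DataflowRunner --numWorkers=5 --maxNumWorkers=5 --autoscalingAlgorithm=NONE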
    Oct 27, 2020 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T12:45:48.384Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 27, 2020 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T12:45:48.928Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 27, 2020 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T12:45:49.019Z: Expanding GroupByKey operations into optimizable parts.
    Oct 27, 2020 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T12:45:49.046Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 27, 2020 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T12:45:49.153Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 27, 2020 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T12:45:49.191Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 27, 2020 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T12:45:49.246Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 27, 2020 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T12:45:49.287Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 27, 2020 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T12:45:49.901Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 27, 2020 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T12:45:49.963Z: Starting 5 workers in us-central1-f...
    Oct 27, 2020 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T12:46:14.071Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 27, 2020 12:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T12:46:23.256Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 27, 2020 12:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T12:46:37.620Z: Workers have started successfully.
    Oct 27, 2020 12:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T12:46:37.651Z: Workers have started successfully.
    Oct 27, 2020 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T12:47:09.451Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 27, 2020 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T12:47:09.638Z: Cleaning up.
    Oct 27, 2020 12:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T12:47:09.724Z: Stopping worker pool...
    Oct 27, 2020 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T12:48:02.326Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 27, 2020 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T12:48:02.368Z: Worker pool stopped.
    Oct 27, 2020 12:48:09 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-27_05_45_40-14895917271299975503 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b82d3c02-ea97-410f-afb4-b3a4811b09a8 and timestamp: 2020-10-27T12:48:09.121000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     12.96

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 27, 2020 12:48:09 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 2 mins 46.844 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 51s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/ocxqjrhg3vux6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1165

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1165/display/redirect?page=changes>

Changes:

[tysonjh] [BEAM-11130] Exclude OrderedListState VR tests.


------------------------------------------
[...truncated 273.70 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 27, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 27, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 27, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 27, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 27, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 27, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 27, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])
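The BEAMPlan above is what the planner produces for this query; the IT materializes it through BeamSqlRelUtils.toPCollection (see the stack trace below). The same query can also be expressed with SqlTransform against a schema-aware PCollection. A minimal sketch, assuming an existing PCollection<Row> of Hacker News rows; the method and tag names are illustrative, not the IT's actual code:

    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionTuple;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TupleTag;

    static PCollection<Row> filterHackerNews(PCollection<Row> hackerNews) {
      // The tuple tag name becomes the table name visible to the SQL query.
      return PCollectionTuple.of(new TupleTag<>("HACKER_NEWS"), hackerNews)
          .apply(
              SqlTransform.query(
                  "SELECT `by` AS author, type, title, score "
                      + "FROM HACKER_NEWS "
                      + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
    }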


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1214853303]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:494)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
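This is the missing-coder failure the message itself explains: the Row output of ParDo(RowMonitor) carries neither a schema nor an explicit coder by the time the next transform is applied. A minimal, self-contained sketch of the fix the message suggests, assuming the Beam Java SDK; the class and field names below are illustrative, not the IT's code:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaExample {
      // Schema for the Rows emitted below.
      private static final Schema SCHEMA =
          Schema.builder().addStringField("author").addInt64Field("score").build();

      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // A ParDo that emits Rows cannot have its output coder inferred automatically...
        PCollection<Row> rows =
            p.apply(Create.of("someone"))
                .apply(
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void process(@Element String author, OutputReceiver<Row> out) {
                            out.output(Row.withSchema(SCHEMA).addValues(author, 3L).build());
                          }
                        }));

        // ...so attach the schema explicitly, as the error message suggests
        // (rows.setCoder(RowCoder.of(SCHEMA)) is the equivalent alternative).
        rows.setRowSchema(SCHEMA);

        p.run().waitUntilFinish();
      }
    }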

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 27, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 27, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 27, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 27, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 27, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 27, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 27, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 27, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
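The pushed-down projection and filter above correspond, at the BigQueryIO level, to a Storage API read with selected fields and a row restriction. A rough sketch of an equivalent standalone read, assuming Beam's BigQueryIO Java API; the table reference is a placeholder, not the IT's configuration:

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("some-project:some_dataset.hacker_news") // placeholder table
                    .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction(
                        "(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }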
    Oct 27, 2020 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-c3U-9kzu9lCFzfCoqU_W5anpFGfiBvbDph9A-IhSmSk.jar
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-nzM-C5DB78DMwVWDUnOnHsNi7rAqHVAxwkZVt3HczDQ.jar
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-zVCVNFUMtwQ9bs15b4ciENzjcAhe-eSJpc9bXfUdTNM.jar
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-f-jCTPs1kLUwpsXt_83PTlxXWwNVeSVJjCEWw6UeLKE.jar
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8351152331345615137.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-pS6vpO__c7d64FunTf-JUbGboqtizZHikba65zTGtGo.jar
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-S64T_AlBkaCEEC8b2jYXbgGvmOwQyY-VjBD1F5qxYWQ.jar
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-jPOJ_PMej4U_2tITfXUebqlxYUlOR4ec9shqigGUwM4.jar
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-YD6ibv2PTKzcdl0gmCc8Dauc_A2icK37nPft8P1MNVA.jar
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-bHtb6DtbkjAeljxVf2uaWuC-MU2BHe05s7oelaXq5HM.jar
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-z3jXou1eKak2D1j2pZn4c-02J6X_txF0-16kCf0uQcM.jar
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-ZUQjbfvVjpl3R0ik_SDebdf8V4IoRjfpm_0SRazxXNk.jar
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-LiA_HWD35AC7OhIbAYXnP2oog3iXMUo2u76Fdb9Xoho.jar
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-RnzkwLN4gbXSQbWK72MhrhQSLOV4MROnhItdreKKDNA.jar
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-DnZTdaHKv2qsYq7w-ixV2kt6EW2hQjMPW0QfGEAdZdo.jar
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-aOob4DaO3pg2hMFUIq5aBSOMR_RdLxZgCtJWNdqEYrY.jar
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-JVDOVOs_RIYPfujXqXcx8ixAkyxca05rznTsb-0W-kg.jar
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-c3U-9kzu9lCFzfCoqU_W5anpFGfiBvbDph9A-IhSmSk.jar
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-gBusld1GvUM6v7ttXOIbDMs0jCA1FG57e3gm_obEgYQ.jar
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-dU-UHrkVwWsWx3jHvM4vgUQd0HmO0CamDMiyI686Q8Q.jar
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-jw9hEBwhQ1G0TCF3TPckiaAqO6w9r9BCNm8MjwaW3Hc.jar
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-FW6ql8QtPrTG-nzt8KLycgrTrrei1UCOnn_ei4661Fc.jar
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-AgFoMjW7diVyTZr6jF0Z2wov-p2-o_S3zNrLSKRd9bg.jar
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-2KZG5YbtMyrkzcfohp0rMAjKi7HntWzsTe5ERS4oxUM.jar
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-3tGNWDI_V1kEn8uQUac-2mi6Ui9fGQup8xhCdpVqWk4.jar
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-KzgDX11ZZhTgO92fl1MC0ACkF_IS-fd0SI7xFlU72ik.jar
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-eXfFMWzJcjFBL0C-kz0FHVf_6ROsLOSchKEwPaEs-Gg.jar
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-fDq9tEOpSsTqx5DPo1nj2i3FjntLpRunZS3a32FpwuE.jar
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-vdhh5GambR1HN0lBABam-RaZ0g47NOenXQfoC28a2m0.jar
    Oct 27, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-hnLTy9gwcoDziJOsCNwrOgZ7FoqtenCydAoqD0jD6KY.jar
    Oct 27, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-QyeQN38qEzCLzlQEApa2FHAACzl2P9PxP2e_Aae_D5w.jar
    Oct 27, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-jBARYqYVhrtHb0Qw1MVMX-f3_GAOSHJymF4d6Bwi4Qo.jar
    Oct 27, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 27, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 27, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 27, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 27, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 27, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 27, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96044 bytes, hash dd353b4dc07991ced10a265e997f159d81650413e00c18cd08257c6941b53cc7> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-3TU7TcB5kc7RCiZemX8VnYFlBBPgDBjNCCV8aUG1PMc.pb
    Oct 27, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 27, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-26_23_45_17-11859952539683990612?project=apache-beam-testing
    Oct 27, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-26_23_45_17-11859952539683990612
    Oct 27, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-26_23_45_17-11859952539683990612
    Oct 27, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-27T06:45:17.071Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 27, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T06:45:26.155Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 27, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T06:45:26.728Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 27, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T06:45:26.768Z: Expanding GroupByKey operations into optimizable parts.
    Oct 27, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T06:45:26.806Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 27, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T06:45:26.870Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 27, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T06:45:26.898Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 27, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T06:45:26.930Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 27, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T06:45:26.962Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 27, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T06:45:27.295Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 27, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T06:45:27.359Z: Starting 5 workers in us-central1-f...
    Oct 27, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T06:45:53.659Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 27, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T06:45:54.862Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Oct 27, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T06:45:54.892Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Oct 27, 2020 6:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T06:46:15.996Z: Workers have started successfully.
    Oct 27, 2020 6:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T06:46:16.031Z: Workers have started successfully.
    Oct 27, 2020 6:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T06:46:31.328Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 27, 2020 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T06:46:51.220Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 27, 2020 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T06:46:51.393Z: Cleaning up.
    Oct 27, 2020 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T06:46:51.468Z: Stopping worker pool...
    Oct 27, 2020 6:47:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T06:47:44.136Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 27, 2020 6:47:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T06:47:44.181Z: Worker pool stopped.
    Oct 27, 2020 6:47:52 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-26_23_45_17-11859952539683990612 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 2eb51036-a8a9-46f7-ba0c-b855388d955f and timestamp: 2020-10-27T06:47:52.059000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.781

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 27, 2020 6:47:52 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 48.906 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 37s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/ceiktxxnsz3n6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1164

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1164/display/redirect?page=changes>

Changes:

[noreply] Update beam-2.24.0.md

[zyichi] [BEAM-11113] Switch default pickler compressor back to zlib for Coders

[Kenneth Knowles] Enable JSON_EXTRACT and TO_JSON_STRING in ZetaSQL


------------------------------------------
[...truncated 274.56 KB...]
    INFO: BigQuery method is set to: DEFAULT
    Oct 27, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 27, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 27, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 27, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 27, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 27, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1214853303]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:494)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 27, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 27, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 27, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 27, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 27, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 27, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 27, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 27, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 27, 2020 12:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-09VcIzYqxpvclaLRP_OBhtvR_xd5uUiP8JqF-3VIKSg.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-Z6pfSa1APLAnROgJT6AXRYwpxuohLuDKKRLJxIPvGI4.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8694297664157114467.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-O4LweKn2YmYziz_wZZgbSPKnyV-WrAs6-ddebQDpzLA.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-GzoNO1owHhv71oMaCa6ubyo3NiOblNefslljVz7Vjoo.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-5RO7ItXL0qjag3FXZ9xKQhTcNVksajwsPDsbKvcXfwg.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-CN6pyITDrF-dRrPu0Y80JY3TzgOIn9OegrgFURh_yys.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-wz2RLXKhc_BrePsZWcdBC63OeQziTyB-WYLz3Pcvx38.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-UvbF4WabBOgCV0yL_pSRkf1DXPSL4hd-A8Sg9XLnKzk.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-ctl2dh_7ardm0ViTZt1WtZZcK7wxRmJiHRvWSFuPMVA.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-prHZMalgtIZUcTs6bbQJHv4ykZdipjyFwPjXphNPBvk.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-Kswxx92jm3mUcqD_nxDAFgiBMZ936xY3E9op3CswKuY.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-tfKePVQgZKjLNsj9PwurCwKZHWRxK9_s9RlzdTaLWWw.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-rBklStTB74YqjvoekMxgSSfIzxlIGds4WcRfgowgWOk.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/6.6.1/workerMain/gradle-worker.jar to gs://temp-storage-for-perf-tests/loadtests/staging/gradle-worker-U_aL5a9pcLxwBxCREhg6dHkIraZJd5OxEJ7qw-KLY7I.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-_OnDurydGNg1_alTOnXUCa08mJS1_dBWYrEUW75MThM.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-P3YdmdzeT1VBJIOfsiV39yblb0odE1e42rqvuzQoH64.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-5DibVwn95XtdcyQlCj-54yTLC3UrZlAPQLdIwgDIlqw.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-09VcIzYqxpvclaLRP_OBhtvR_xd5uUiP8JqF-3VIKSg.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-5d7039Mktl24UiGbvSO6FpY6p3w27EcEaxTveMd7_WM.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-kEMvCHt-VjJiSFXN1WQ19g8AhCDblkmCMmzphR_ZvL4.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-YE9ncu_KlZSHPfz3b_LNuGOMJhwWpdt5bGCX0KAxmSo.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-9syY1zS3m_yhxKIsY5XoKSjGi8MdVu6dnFSyxLSLNHg.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-XUKoTLG87TxwPPepDHmw-g22oLjsU5oLOt2S3POahsM.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-q46ENtXpoUD_ZqHgM4Bn6B_vtQy0Tf4tyFNezD7eH0U.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-K1AkB_X3AbROclAD3VCvp_cGTWDaJr4poaJlqrOCwxM.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-U-_eRx0zLOE78WLM9UI9eL7uvNrsnA-lEearNibAs8k.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-g7uBdQhR7W9_qu4HrzDDxN2Kz-otntTHQfECt8aizUw.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-45E6OMSwGH4OP0VkMu4ZHCxSoBPosTUHXAHjHWegbVE.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-lYq3HURJiIk-Z56MHWD-kcoaww4m5stUTQiYpAGtHtE.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-VDFUTNx4tOx1A10nN2MV-SMfaaIKNhr_UkROJXBLkv8.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-TWrFU7eeXBS6Hgk8RU0bvv_7yPjPH5X0RKErEr9gzdM.jar
    Oct 27, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-GC4SAAyBKBFCPAYVScDZhkn_f2oYQBJwxJxeupvnhE8.jar
    Oct 27, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 188 files cached, 31 files newly uploaded in 1 seconds
    Oct 27, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 27, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 27, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 27, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 27, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 27, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96042 bytes, hash 968034b754749af61220664e0db0bd3ff0ef05a534c04e17c4625e88bae98e42> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-loA0t1R0mvYSIGZODbC9P_DvBaU0wE4XxGJeiLrpjkI.pb
    Oct 27, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 27, 2020 12:45:44 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-26_17_45_43-16769279887901896870?project=apache-beam-testing
    Oct 27, 2020 12:45:44 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-26_17_45_43-16769279887901896870
    Oct 27, 2020 12:45:44 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-26_17_45_43-16769279887901896870
    Oct 27, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-27T00:45:43.661Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
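This warning is expected for these perf-test runs: the job appears to pin a fixed worker count and disable autoscaling, so the requested maximum is ignored. A minimal sketch of Dataflow pipeline options that produce this combination (flag values and class name are illustrative, not taken from the job's actual configuration):

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class FixedWorkerOptionsSketch {
      public static void main(String[] args) {
        // A fixed pool of 5 workers with autoscaling disabled; the service then ignores
        // the requested max number of workers, producing the warning logged above.
        DataflowPipelineOptions opts =
            PipelineOptionsFactory.fromArgs(
                    "--runner=DataflowRunner",
                    "--project=apache-beam-testing",
                    "--region=us-central1",
                    "--autoscalingAlgorithm=NONE",
                    "--numWorkers=5",
                    "--maxNumWorkers=5")
                .as(DataflowPipelineOptions.class);

        System.out.println(
            opts.getNumWorkers() + " workers, autoscaling: " + opts.getAutoscalingAlgorithm());
      }
    }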
    Oct 27, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T00:45:51.432Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 27, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T00:45:52.149Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 27, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T00:45:52.185Z: Expanding GroupByKey operations into optimizable parts.
    Oct 27, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T00:45:52.216Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 27, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T00:45:52.282Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 27, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T00:45:52.315Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 27, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T00:45:52.338Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 27, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T00:45:52.384Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 27, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T00:45:52.743Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 27, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T00:45:52.811Z: Starting 5 workers in us-central1-f...
    Oct 27, 2020 12:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T00:46:17.138Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 27, 2020 12:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T00:46:17.198Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Oct 27, 2020 12:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T00:46:17.241Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Oct 27, 2020 12:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T00:46:22.429Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 27, 2020 12:46:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T00:46:39.862Z: Workers have started successfully.
    Oct 27, 2020 12:46:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T00:46:39.884Z: Workers have started successfully.
    Oct 27, 2020 12:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T00:47:09.132Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 27, 2020 12:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T00:47:09.260Z: Cleaning up.
    Oct 27, 2020 12:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T00:47:09.342Z: Stopping worker pool...
    Oct 27, 2020 12:48:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T00:48:10.228Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 27, 2020 12:48:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-27T00:48:10.280Z: Worker pool stopped.
    Oct 27, 2020 12:48:16 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-26_17_45_43-16769279887901896870 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 7d5d3b0e-4f51-458b-ab15-c9f8f1c0cd08 and timestamp: 2020-10-27T00:48:16.109000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    10.603

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 27, 2020 12:48:16 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 47.709 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 59s
107 actionable tasks: 69 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/bdcrturi46xco

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1163

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1163/display/redirect?page=changes>

Changes:

[piotr.szuberski] [BEAM-11058] Enable HadoopFormatIOElasticIT on Java PostCommit

[noreply] Fix URL to BEAM-9615 (#13190)

[Robin Qiu] Move Beam 2.25.0 release blog to the correct directory


------------------------------------------
[...truncated 272.15 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 26, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 26, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 26, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 26, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 26, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 26, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 26, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1214853303]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:494)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
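
The IllegalStateException above is the coder-inference failure behind both red tests in these runs: the Row-typed output of ParDo(RowMonitor) reaches pipeline finalization with no schema and no coder. Below is a minimal, self-contained sketch of the two remedies the message itself names (PCollection.setRowSchema, or an explicit RowCoder via setCoder); the schema, element values, and pass-through DoFn are illustrative stand-ins, not the test's actual code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaRemedySketch {

      // Identity DoFn standing in for the RowMonitor ParDo whose Row output lacks a coder.
      static class PassThroughFn extends DoFn<Row, Row> {
        @ProcessElement
        public void processElement(@Element Row row, OutputReceiver<Row> out) {
          out.output(row);
        }
      }

      public static void main(String[] args) {
        // Hypothetical schema for the projected columns (author, type, title, score).
        Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();

        Pipeline p = Pipeline.create();

        PCollection<Row> rows =
            p.apply(
                "CreateRows",
                Create.of(
                        Row.withSchema(schema)
                            .addValues("someone", "story", "Example title", 3L)
                            .build())
                    .withCoder(RowCoder.of(schema)));

        // Without one of the two calls below, coder inference fails exactly as in the log above.
        rows.apply("PassThrough", ParDo.of(new PassThroughFn()))
            .setRowSchema(schema);
        // ...or equivalently: .setCoder(RowCoder.of(schema))

        p.run().waitUntilFinish();
      }
    }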

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 26, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 26, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 26, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 26, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 26, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 26, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 26, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 26, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
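    By contrast, the push-down test succeeds: the plan's usedFields and the supported BigQueryFilter above are what BigQueryTable appears to hand to the Storage API read. Expressed directly against BigQueryIO, the same column projection and server-side row restriction would look roughly like the sketch below (illustrative only; the public Hacker News table name is an assumption, not the dataset this test reads).

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Read only the used fields and let BigQuery evaluate the filter server-side,
        // mirroring usedFields=[by, type, title, score] and the pushed-down predicate above.
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full")  // assumed table name
                    .withMethod(TypedRead.Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }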
    Oct 26, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 26, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 26, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-Inv1WrTb5q6k7cAoJqBdnKjYy-HsI43KFjt2xkoQ6AE.jar
    Oct 26, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-UElGbx_9B4LvfsZC4odPNGA4Wq3qPNB4FW84gZ0iGjY.jar
    Oct 26, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-LXJtG02q4SkTfhCugUBVvxKO_jWSgXkfSX3Go7vxu3Y.jar
    Oct 26, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-k1fvI2284fxHu2TxEy3tAil4eTj6A9t_naJ_SYrt5Sg.jar
    Oct 26, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-HBUsjmTPksvuCZE8VYtgtMaBRrI7LxXe2p7n_9M7kDc.jar
    Oct 26, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-8uT5st1cHqYXXDFlMAdGliZfcm-r_p0tykhH-eb3F60.jar
    Oct 26, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-HbtUz-3lJ-A5H07LwrgVTZcAwSSrnaNqB6ZMTWzuw2E.jar
    Oct 26, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-KY26i88Lp51SMZE92NeGIpy-7gKFALiAjr_DLi3dCIQ.jar
    Oct 26, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3875604397199432076.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-KoE49KOXNU_qxa4R3e9oeCgBzEkNBAkihYxbjhkswp4.jar
    Oct 26, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-oG9rhr5QC-_nJR1G6vyPWVz50yFS0-vpxgLCOpEvLtk.jar
    Oct 26, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-mIRwJgqC8fJs8mHII3wLcMG94nmPs0I_YChb9TrkoMc.jar
    Oct 26, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-y5bqf2rBRyKu34D33SLzTljhewWHq9frtSOwGDo0NOM.jar
    Oct 26, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-jt0db25kcEe6Szx7iv8coPpT06hwlHSZWV8mXSZRG4U.jar
    Oct 26, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-kl243QFLIBH8X3jJ3KeuUJ5MlkytaavT_B39ePs54Ms.jar
    Oct 26, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-4CNQ5PHXptSKrYpdQQNEsfzvr0L8QLPzjEBRlyyY9NM.jar
    Oct 26, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT--LCiiCxoAk8Rn59K69VU4aRK5rXnZgct706_xVdYqdQ.jar
    Oct 26, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-Inv1WrTb5q6k7cAoJqBdnKjYy-HsI43KFjt2xkoQ6AE.jar
    Oct 26, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-TFZ82QSyhjLAZOR6JRdT57MZa5DxToVnJQSTHj6tEpE.jar
    Oct 26, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-CUAf_cVH7esIdB0D2_KQX_SG9x2r_M1ihGOIs1fn0-o.jar
    Oct 26, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-WjC_e2fgS_NQn0Q5meXQcGj8qEUvEIdFrPCHVsHracs.jar
    Oct 26, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-QpliDYdk0z3yRnXZ0w8Z9VWZb_1i5G-aMYC9mM2GZ3k.jar
    Oct 26, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-gCn2Re2Mj9RiD69AHvnYsqnLt-nvtT8hmcz3Fs7_y40.jar
    Oct 26, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-ZbHSKw5PQ8Z-nv07bCTwDeqT28_Go0dkslAM4HitNy8.jar
    Oct 26, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-AXt2pCovJ1MoXT7jRS8xnLJewl7VtPI0Wvxf9ejPeC0.jar
    Oct 26, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-Gy1xkdi5-IPMJEaxz52jVceqW-FyKpCD3QGNK-hw2Co.jar
    Oct 26, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-4oRt0owRbVr1ZbFahNEkP0iN2R4IwfN5Z-mEFBjAqSY.jar
    Oct 26, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-5RGL6Yb7n0WNfJeoaJVXng5YNM1nTpHtAMMEcw5ixZ0.jar
    Oct 26, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-IiTEj_Z7rSlzwLwHJtdir1PC1KjwckledWJSZ6XhdS8.jar
    Oct 26, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-punZzhOdXv8cuZKcn3qWoYru68CX07XqxvZ1dq4V1sE.jar
    Oct 26, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-UPIbVdYm3aXG2Ed5NfS0w-tCZu8LFWZNhDRVysr1hmk.jar
    Oct 26, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-7Zr8EPkjHQ-CfRdQzKm1tLJEDUR9CNwwVeNfPB9Nqyw.jar
    Oct 26, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 0 seconds
    Oct 26, 2020 6:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96042 bytes, hash 7b3f213c847bd0ca59e3d649f930a5c3ea89ec6528af4b2c0bd97b0f834a2636> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ez8hPIR70MpZ49ZJ-TClw-qJ7GUor0ssC9l7D4NKJjY.pb
    Oct 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 26, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-26_11_45_25-16129640501559923399?project=apache-beam-testing
    Oct 26, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-26_11_45_25-16129640501559923399
    Oct 26, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-26_11_45_25-16129640501559923399
    Oct 26, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-26T18:45:25.440Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 26, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T18:45:31.636Z: Worker configuration: n1-standard-1 in us-central1-b.
    Oct 26, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T18:45:32.605Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 26, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T18:45:32.635Z: Expanding GroupByKey operations into optimizable parts.
    Oct 26, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T18:45:32.774Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 26, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T18:45:32.853Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 26, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T18:45:32.882Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 26, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T18:45:32.918Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 26, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T18:45:32.953Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 26, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T18:45:33.296Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 26, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T18:45:33.365Z: Starting 5 workers in us-central1-b...
    Oct 26, 2020 6:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T18:46:03.012Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 26, 2020 6:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T18:46:15.033Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Oct 26, 2020 6:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T18:46:15.069Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Oct 26, 2020 6:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T18:46:20.348Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 26, 2020 6:46:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T18:46:41.877Z: Workers have started successfully.
    Oct 26, 2020 6:46:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T18:46:41.909Z: Workers have started successfully.
    Oct 26, 2020 6:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T18:47:16.378Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 26, 2020 6:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T18:47:16.516Z: Cleaning up.
    Oct 26, 2020 6:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T18:47:16.604Z: Stopping worker pool...
    Oct 26, 2020 6:48:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T18:48:07.895Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 26, 2020 6:48:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T18:48:07.943Z: Worker pool stopped.
    Oct 26, 2020 6:48:13 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-26_11_45_25-16129640501559923399 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 656f294b-4f73-4bb2-9c35-ba1216dc7858 and timestamp: 2020-10-26T18:48:13.754000000Z:
                     Metric:                    Value:
                   read_time                     14.88
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 26, 2020 6:48:14 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 3 mins 2.927 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 56s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/o6ohv535yxitu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1162

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1162/display/redirect>

Changes:


------------------------------------------
[...truncated 271.51 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:494)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 26, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 26, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 26, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 26, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 26, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 26, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 26, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1214853303]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:494)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 26, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 26, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 26, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 26, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 26, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 26, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 26, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 26, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 26, 2020 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-QAInhOtKrWqohiEe-79tomzbFXEEEmaGxJzpKFujhAI.jar
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-T_gacjBi9R-33BCWp2IPRcMT_94YdrRrA8NGp6qGcB4.jar
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-ekVEWXPQcUdamD7o237p5iYVbSJ1l3Ah8nC1pLJXlv0.jar
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-d0F7PscdoJ1NJVKl5LPmv5ZNeOgwCtianY5sQvsBZ_Y.jar
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-jbLd7-4n54jO5PQdVCCYKX6UmcZwLhM5fFCHRJ_VB9o.jar
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-kYFD2p47GoAlUT-FGRGrWPT32TT3Z1KGTohoW9D85oM.jar
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-OvGzsuCvtn1nKB6IcA3feoRZ9NLcX8CzqnWF4oTolx0.jar
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-Uts7hx45TP5psNQzgpwJJD2FacNxyJdVqXJMwGHEvck.jar
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3986401515491233646.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-59M8Lx4aaTm4Hs1V3Twm4l2xAYuesOAVcXYn3ttNnew.jar
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-w1IwogJH5TvwO6nSlpH7AVRQJLiyp1uw0GkeJIkbWw8.jar
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-_f3s8khx-EkqbEGRT-Akna5V3x8Fdz_5xW7C2Jge_5w.jar
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-0ygiKn5yrc2uURBCkHXNex_5te3vPT9quaES4sYbq3I.jar
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-_kFEtdl35YFkCHhso3uwlcr1eKHlEnEjy_Qa3sYaeQg.jar
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-87kKruU2JDyAiG5MYrblVaLs4EQpYBeSCRfkPFeAOlY.jar
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-gn6-JQ2Zzdatv80JQyhdLUdVyo0OboFpxwDpWEiNGqU.jar
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-U9K38EQubZs2zC8n86IRBSeJBzvQ3FPZCCJ0Vllfhhw.jar
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-QAInhOtKrWqohiEe-79tomzbFXEEEmaGxJzpKFujhAI.jar
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-V33mh047eI1GaOofSwwfIdNEt5l6g-WwjBHdQJUjZH4.jar
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-Rtb8v2_sGY_7xmkQwgdAFDaGWeBmERXddMFk9qgPrdQ.jar
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-mDpD-cFla1uPSi0YO6di9SUSZFWqqTEO-zw-PFsUC7w.jar
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-if4tN-9wPvnad8wW9myZFDHIcoppEXWsrPNUAGuHT58.jar
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT--EM7otbVRu1xUf8T_Jts93asdWrgPOXptiZdLXdx0SY.jar
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-nmIcdwejsjFP_69pLhiawYdsTxw2XFCvwvedetqxhGg.jar
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-gxmOOiruHMI4l1HbSLEAp9xAe-cELAdTChvGqUeyvF8.jar
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-9fZZhcuxKi80OPPyy35kyH4ROcaktzhKO6WbXN_3aDk.jar
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-kZm8zhwkuNY3G0wRfuc4FDAe9HnCktax8crI-KgbyLo.jar
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-DMA9WnwqxLPv47kcF_21Y2akCgPvYPumwv7yYnncpJs.jar
    Oct 26, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-O_B1fa9pr69up6mSrj4jbDCMVjzRsLyYNb5e08jB24Y.jar
    Oct 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-OrDpqrFa7FsX786F_yg9OrA7QTnan2Yq4bt_AP9nt9U.jar
    Oct 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-vRqkuZhoNxyxmyWiPSlKssFWu4gRE5y71DDkvwQJcn8.jar
    Oct 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-RsQKT9b5On1RPclyyxdH7TkKkjXwGxZiKWCr8QGkDlA.jar
    Oct 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 0 seconds
    Oct 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96042 bytes, hash 865cb626bbce75c39f6dfd9e80deda7f77f6251bf20076a544192779912e524b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-hly2JrvOdcOfbf2egN7af3f2JRvyAHalRBkneZEuUks.pb
    Oct 26, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 26, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-26_05_45_16-4984456813355991956?project=apache-beam-testing
    Oct 26, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-26_05_45_16-4984456813355991956
    Oct 26, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-26_05_45_16-4984456813355991956
    Oct 26, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-26T12:45:16.972Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 26, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T12:45:24.969Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 26, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T12:45:25.728Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 26, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T12:45:25.762Z: Expanding GroupByKey operations into optimizable parts.
    Oct 26, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T12:45:25.801Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 26, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T12:45:25.862Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 26, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T12:45:25.891Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 26, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T12:45:26.025Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 26, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T12:45:26.061Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 26, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T12:45:26.572Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 26, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T12:45:26.654Z: Starting 5 workers in us-central1-f...
    Oct 26, 2020 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T12:45:54.143Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 26, 2020 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T12:45:55.378Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 26, 2020 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T12:46:12.626Z: Workers have started successfully.
    Oct 26, 2020 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T12:46:12.660Z: Workers have started successfully.
    Oct 26, 2020 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T12:46:43.038Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 26, 2020 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T12:46:43.202Z: Cleaning up.
    Oct 26, 2020 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T12:46:43.357Z: Stopping worker pool...
    Oct 26, 2020 12:47:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T12:47:27.959Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 26, 2020 12:47:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T12:47:28.003Z: Worker pool stopped.
    Oct 26, 2020 12:47:33 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-26_05_45_16-4984456813355991956 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0b80f8f0-079d-4cb0-bc3a-022f65ded4ee and timestamp: 2020-10-26T12:47:33.235000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     11.09

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 26, 2020 12:47:33 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 29.811 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 16s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/2h3cc64n22d4w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1161

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1161/display/redirect>

Changes:


------------------------------------------
[...truncated 271.93 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:494)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 26, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 26, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 26, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 26, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 26, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 26, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 26, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1214853303]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:494)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
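
The remedy the exception text points at is to attach a schema to the Row-typed output so a RowCoder can be used. Below is a minimal, self-contained sketch of that pattern; it is not the integration test's actual code, and the class name, schema fields, and sample values are placeholders chosen only to mirror the projected columns in the query above.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.transforms.SimpleFunction;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFixSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Placeholder schema covering only the four projected columns.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt32Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(
                Create.of(Row.withSchema(schema).addValues("alice", "story", "hello", 3).build())
                    .withRowSchema(schema));

        // A Row-producing transform cannot have its output coder inferred automatically;
        // attaching the schema avoids the IllegalStateException shown above.
        PCollection<Row> monitored =
            rows.apply(
                    MapElements.via(
                        new SimpleFunction<Row, Row>() {
                          @Override
                          public Row apply(Row row) {
                            return row;
                          }
                        }))
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }

Calling setCoder(RowCoder.of(schema)) on the same PCollection is the equivalent explicit-coder form mentioned in the first bullet of the error message.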

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 26, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 26, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 26, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 26, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 26, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 26, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 26, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 26, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
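
For readers following the plan output: BeamPushDownIOSourceRel records that only the used fields (by, type, title, score) and the supported filter are handed to the BigQuery storage read. The sketch below is only an illustration of the same query shape run with SqlTransform over an in-memory, schema-aware PCollection; the class name, schema, and sample rows are placeholders, there is no BigQuery table provider involved, and therefore no storage-level push-down happens in it.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class QueryShapeSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Placeholder 4-column schema; the BigQuery HACKER_NEWS table has additional fields.
        Schema schema =
            Schema.builder()
                .addStringField("by")
                .addStringField("type")
                .addStringField("title")
                .addInt32Field("score")
                .build();

        PCollection<Row> input =
            p.apply(
                Create.of(
                        Row.withSchema(schema).addValues("alice", "story", "hello", 3).build(),
                        Row.withSchema(schema).addValues("bob", "comment", "re: hello", 1).build())
                    .withRowSchema(schema));

        // Same projection and filter as the query above, but against the implicit
        // PCOLLECTION table rather than a BigQuery-backed table.
        PCollection<Row> result =
            input.apply(
                SqlTransform.query(
                    "SELECT `by` AS author, `type`, title, score "
                        + "FROM PCOLLECTION "
                        + "WHERE (`type` = 'story' OR `type` = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }

With the BigQuery table provider and method DIRECT_READ, as in this test, the planner can instead push the projection and filter into the storage read itself, which is what the log lines above show.
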
    Oct 26, 2020 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 26, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-yKw0-Te7nilUbJfqzKh-_MC5VHCuXhqRET9vgfWy_70.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5327488097214546970.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-hPRo1E3iObfZ8IJaKlAtEYZtQ7_Czz52Pz6hjtlvxz4.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-01O1LMPKdjWTKk9s5Go5rEv29GoOr2wY8ipdph6raVM.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-XX9UBzf2E2swv9NygW7TgTmGiz_j-5lzQZ5PBwv0vCM.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-0jKpVAkQpliEgCl_g9yzgVdZhITgG2Yjde74Xm36fZU.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-AA7xNLDKuPmgeWznz7dXksJB3Efq8pS73D707FlTZK0.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-HKKtmy_S4SilZLZJlU-7qDF_A2ERHrxobVha5Kb6Tcc.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-u-f-CVuiZ4994NIF4TPD3KDo1ug7sEi6U4NtjrZiTgI.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-aJYCCvSnNeX997b88BWO-9sl_AuolmihVSz-n112Dlk.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-EemFd2z2XLXKQAbbtZfilfEEM9ua3fNzBT2C6XKVNWM.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-Mxgt0zR0fFEfIlOweCQEC9tQtCTgsL8B-fZXPA36MO4.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-iXC_t-_a9mImFBF9IdvyWbvVVWICPfL85Pr8Kne__bA.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-V6GO30bJfijzIIMfMud0bD96xImMVn2eFhsWjvAI0UU.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-pqmbu20ejPQQeNlVADakxYYnfWeYvkEQGC9Yt8iuv64.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-jRboYXx31cQ8GPDonCgbujjgZmocKG5-RXtmR5fG5Pg.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-lfTVvdjeVOZtCHwe0RquOMbonSYp1tgRrG0CVsH5J3M.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-yKw0-Te7nilUbJfqzKh-_MC5VHCuXhqRET9vgfWy_70.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-Wls3ULBP4g6VCH5TbnVMgNRI-pIy7PHWA94tQe0TUpA.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-qzSebLxLLoleFkPYcK6tNVvr_T5UZtM1Cy59J6kT7n4.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT--Fvy9KICg63Q6lzfMvs4aoia1cLgATn14cVb59tolgw.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-w4WHL0l_BscMZlZ-uRa8PXoyEFlVw9jXKG39WNmUtfI.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-ax6WQGexCkVU-XQv2QzBceEw9OLi0wsu9SX8gsiYP6I.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-OOJauaJuVWUu8_wAcHAmZCADt4NeI9WUNaakpmF9dqI.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-nVrJszIAOGGB_xpp5IGK5OFXz07k27w187MQAxpbcos.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-ko1dT_DNN6pITFjN1SvmX3MVaOeVHDs2siWcmAiIehA.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-LVnJuxLKuulDKinin7NMszkwMT3Nq3nKj5LysEAuahs.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-KqTmKroS7CmpT1mfgGuQNJgkpkAursvaRZIUhvUSvfg.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-N7zHg5UBG5KqJ_w3ZQBXJqUBUu05Q_fy5HW27_fxoDI.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-jkxvHgKUGIysP74xPIm-VMzMA8WbTx6KhPVN8z-AYqU.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-7_x10xtAayCu6LyFpS1zlhi5zhqWXtV1zdkkA8Lg7zE.jar
    Oct 26, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-tIprXJv9o-zEJtOdaQDY-Q8c9DNDg9rif4a4aWTolvQ.jar
    Oct 26, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 26, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 26, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 26, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 26, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 26, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 26, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96042 bytes, hash 71e60e82f2e4c88629dd259ede4d8c2a14634bdf24aa1078f2a6654c1caf134e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ceYOgvLkyIYp3SWe3k2MKhRjS98kqhB48qZlTByvE04.pb
    Oct 26, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 26, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-25_23_45_16-16818571772244107067?project=apache-beam-testing
    Oct 26, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-25_23_45_16-16818571772244107067
    Oct 26, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-25_23_45_16-16818571772244107067
    Oct 26, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-26T06:45:16.728Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 26, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T06:45:24.208Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 26, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T06:45:24.808Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 26, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T06:45:24.848Z: Expanding GroupByKey operations into optimizable parts.
    Oct 26, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T06:45:24.875Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 26, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T06:45:24.941Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 26, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T06:45:24.968Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 26, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T06:45:25.004Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 26, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T06:45:25.036Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 26, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T06:45:25.456Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 26, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T06:45:25.524Z: Starting 5 workers in us-central1-f...
    Oct 26, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T06:45:34.266Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 26, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T06:45:53.778Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 26, 2020 6:46:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T06:46:09.836Z: Workers have started successfully.
    Oct 26, 2020 6:46:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T06:46:09.869Z: Workers have started successfully.
    Oct 26, 2020 6:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T06:46:42.720Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 26, 2020 6:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T06:46:42.850Z: Cleaning up.
    Oct 26, 2020 6:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T06:46:42.925Z: Stopping worker pool...
    Oct 26, 2020 6:47:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T06:47:32.975Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 26, 2020 6:47:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T06:47:33.021Z: Worker pool stopped.
    Oct 26, 2020 6:47:38 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-25_23_45_16-16818571772244107067 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 99e37259-b658-46fb-90eb-5f3c32f583cd and timestamp: 2020-10-26T06:47:38.700000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.849

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 26, 2020 6:47:39 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 36.071 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 22s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/cpgzzdbg2bmn4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1160

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1160/display/redirect>

Changes:


------------------------------------------
[...truncated 272.52 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 26, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 26, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 26, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 26, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 26, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 26, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 26, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1214853303]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:494)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 26, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 26, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 26, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 26, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 26, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 26, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 26, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 26, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-u1L2hZYeohMV3Khga5bRTHwKmfH2VzPTKy-kcGmupNE.jar
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-gwNgNswpuE1dxTKjpv5tJvMOWtv4Yc-V6-92khCJfRs.jar
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-OGMNHeIIewQNijkw4yES8e_ntkUYIeF-mBeC6rZIYXE.jar
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-UAe_2QbNlsLT0us9D_gv3W3YlmQoR9fOrM8c0CzPC6g.jar
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-ZHsD1H3kamCxunxWedTy8aA6d2-9G63VoFUYeI_zphU.jar
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-5MymuumNc9QOiFWWjv-6xtM5pRyMEWy-2Qbsive5eHg.jar
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-IK34RWBVsBwrf0mBlafFcvPPGar3xdCNWhLl4BXR1pI.jar
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-uTLLWTY0GRSCQnY-BT_Ec_iHa36T9xeqn9LeF2r1vT0.jar
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-P7-FPR0zD4_yHcIkVSvw7ipcI34de4qlg-s-3Ld95-Q.jar
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-VQ_SZrZENCFTnGGo_ypOsZIdb1Q7bYhh64NZSBwEvTU.jar
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-j1d_VVWDoGU0kTLnSngE8SrmRnxE491UAnTR9H8ADIM.jar
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-55USITVSmpzX2HnHRxNZitO-rbjVbk8TQAbDeUlYn68.jar
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-lBPLIj3oy37fGq1YOchbm8TXRc4WcT-59U0w2oTK1sk.jar
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2766508350152748043.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-moI1zcEgPGNY1gp5ZtKdkWuCsF2QJ0sKHFwQHP156Yk.jar
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-jlvOeafEF25xG-Lm9NTwOZnKYOsYvpv85msqrHFbLxE.jar
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-DMwkeDlb_83rpl_Jg8UItW4VSDRjrTmkUwp_dJm3wGc.jar
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-u1L2hZYeohMV3Khga5bRTHwKmfH2VzPTKy-kcGmupNE.jar
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-tk0Ss3iz8uKnYbCsbvsN-5oKegPCIYasCk2HzZP8X8o.jar
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-bkGamGA5l6X4-9e_PAsx4q2HVIPjAJ3ExLUuNy6aZrg.jar
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-nJlA8c0jHojzDnt7B0vzL3Doyjpwbv1RrpvcEqwGo_Y.jar
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-Dg2e9pd8Ynk4pao3mQMhYVULT2_DMtKKT2rG4N_0R0U.jar
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-sDz5Q0GOYa8FCq3sn-5uG9pKzfelOnLhLTH-ejJhZgg.jar
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-VOTgiZb6ygB9JyWThzWWUV4iBC8nYMRhSHNZse3jMnU.jar
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-S6Y9npZvD_BH_OEhxCMaEE95Lk5VwhoOT_4ioxZ-P5Q.jar
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-uIRuQnnMwIj9C72ZcWvPewYO_57nno2wXWSlahHYcJc.jar
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-Pfv_sIwxiHa_0U9ToE8aubEbBBtwwXINplRJ97H1vKk.jar
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-eoqeNYCUmPXUUy4x2G6EhVU_BAdpXRx0nIWphh42pSw.jar
    Oct 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-RY5jKIggnZN9YttwibSSg7JL5fCM6d5rueDn6A1Vd1o.jar
    Oct 26, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-b-RJiczQlLvolxPUkqtqkHlij_XmNAJjLngF9XHgQ8Y.jar
    Oct 26, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-YCYUOvSTNws5hrr9GKc8PnK_SAvHm7_Rs_JEf8czzUc.jar
    Oct 26, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-nn-gpq0OIRZTuRODmmmaJwLsLVy1z52WIMe12yRTFGg.jar
    Oct 26, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 26, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 26, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 26, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 26, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 26, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 26, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96042 bytes, hash 804326470d48684a65a9d2b71f976d8bdaad415b7ae4fc26cf869d395ba9e2cc> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-gEMmRw1IaEplqdK3H5dti9qtQVt65Pwmz4adOVup4sw.pb
    Oct 26, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 26, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-25_17_45_20-446657745001538622?project=apache-beam-testing
    Oct 26, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-25_17_45_20-446657745001538622
    Oct 26, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-25_17_45_20-446657745001538622
    Oct 26, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-26T00:45:20.060Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 26, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T00:45:26.206Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 26, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T00:45:26.803Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 26, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T00:45:26.920Z: Expanding GroupByKey operations into optimizable parts.
    Oct 26, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T00:45:26.950Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 26, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T00:45:27.014Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 26, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T00:45:27.066Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 26, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T00:45:27.099Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 26, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T00:45:27.137Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 26, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T00:45:27.503Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 26, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T00:45:27.582Z: Starting 5 workers in us-central1-f...
    Oct 26, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T00:45:57.691Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 26, 2020 12:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T00:46:04.720Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Oct 26, 2020 12:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T00:46:04.758Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Oct 26, 2020 12:46:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T00:46:26.146Z: Workers have started successfully.
    Oct 26, 2020 12:46:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T00:46:26.179Z: Workers have started successfully.
    Oct 26, 2020 12:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T00:46:51.412Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 26, 2020 12:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T00:46:57.104Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 26, 2020 12:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T00:46:57.219Z: Cleaning up.
    Oct 26, 2020 12:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T00:46:57.317Z: Stopping worker pool...
    Oct 26, 2020 12:47:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T00:47:52.847Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 26, 2020 12:47:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-26T00:47:52.895Z: Worker pool stopped.
    Oct 26, 2020 12:47:58 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-25_17_45_20-446657745001538622 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 2bf15f77-0119-437b-bebd-6be6ec69313b and timestamp: 2020-10-26T00:47:58.778000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.635

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 26, 2020 12:47:59 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 52.964 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 42s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/walt6xaz66re4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1159

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1159/display/redirect>

Changes:


------------------------------------------
[...truncated 271.98 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:494)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 25, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 25, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 25, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 25, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 25, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 25, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 25, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1214853303]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:494)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
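
The readUsingDefaultMethod failure above is a coder-inference problem: ParDo(RowMonitor) outputs Beam Row elements, but the resulting PCollection has no schema attached, so no Coder can be inferred at pipeline-construction time. As the exception message itself suggests, attaching a schema with PCollection.setRowSchema (or an explicit coder via setCoder) avoids it. A minimal, self-contained sketch, not the IT's actual code, with field names assumed from the query in this log:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        // Field names and types are assumptions taken from the query in this log.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt32Field("score")
                .build();

        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());

        PCollection<Row> rows =
            p.apply(Create.of(1))
                .apply(
                    "ProduceRows",
                    ParDo.of(
                        new DoFn<Integer, Row>() {
                          @ProcessElement
                          public void processElement(ProcessContext c) {
                            c.output(
                                Row.withSchema(schema)
                                    .addValues("someone", "story", "a title", 3)
                                    .build());
                          }
                        }))
                // Without this call the pipeline fails at construction time with the
                // IllegalStateException above, because no Coder can be inferred for Row.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }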

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 25, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 25, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 25, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 25, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 25, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 25, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 25, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 25, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 25, 2020 6:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 25, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 25, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-f6hh7NMvZslivL9kzZxRojWGJexbrrmdvgO4M340OiE.jar
    Oct 25, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-oJGS8lJIVSYwwVD8p6CVqh_Cmm6QcCUA5og3KMqowh0.jar
    Oct 25, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-MYHPfnrEFso2J0TL2ml2BTQ8smtuomn7HmT9P9pkREA.jar
    Oct 25, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-4YOPVNqSaGXHyHO6QZ6oRY4fEXvl-otBdLO0uqcLMpw.jar
    Oct 25, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-BQlgbEwF-JAenaoA36p6gUqRR7iwmm_wWm43lE_CNv8.jar
    Oct 25, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-fv7mehVKSA4BKnNTso7by9fKJm1VZHB5YqjRxuAjkrk.jar
    Oct 25, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-LvaKaf__kWxHvjm32QQM79c73dxhIC2HrEFZls_7Iss.jar
    Oct 25, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-zbvItM68GksmGz0WHN9qSvNupEurpeyL9lpjtjYj3qk.jar
    Oct 25, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3848369045722569544.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-1xsLCH5_3nZvewx68EEr_tuOWlXVZn3f96Pm_vOfdrg.jar
    Oct 25, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-TQ5gIYf4906QuzYAFzAfWGNaEIs3SXU9q3ArCKbzvoE.jar
    Oct 25, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-AkairdWhLXmvK4bdhCI4kv6CFD6vkSPIux3wZqPRvb8.jar
    Oct 25, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-Yd3UAni_V9Llroq1MfGobcTrvglgiPCvYxHM7ZxvxYM.jar
    Oct 25, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-b8eBVVPdFTLvHsK_-wiwvlfB7e0u9NZEDkoMqOLoJr4.jar
    Oct 25, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-lyZg8ERxhGK4oygVMXY6NtXoDngN128yMj_x4k4BMIo.jar
    Oct 25, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-uU4GaVcq3Mj9pN7l0GkJpJWPPuV0nAJoKgPN-jtAgb4.jar
    Oct 25, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-B5pKbtoqruvJPsMCXRmL6ul1MnpVy9-Ko4TqgQUCZOk.jar
    Oct 25, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-NOskx7iSLFsdDBd-YbvJ1wQtmG301kt5ycFxglJ-3HQ.jar
    Oct 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-igibO9OXjU0UtoUAteUNADIXPKdK41AXjyl9Fw0OnXo.jar
    Oct 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-0totHVJgmEL-GbrymneUW7aZN8iXCdV3nXH7riSf8Es.jar
    Oct 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-bLW3PN4C0z2DcKcyLtdzMvmsyPwBQPQagHvB3Q4MZUA.jar
    Oct 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-fcAOIXG51Q6jo6CNwGmhVLfrZFtAeOIr4QLKN0owk6g.jar
    Oct 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-MYkWIlwJxpcMhJ0NBruWMSiopT7TXUtrIbYJgcz83H0.jar
    Oct 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-nQSOFyV0liDPtHHGMrXkOuxOOIanMYG3FKfoZ6cph2g.jar
    Oct 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-nR01c0bi-yPudmDY3KgNf1xSzN_iG4TzlrEljbHJ8i0.jar
    Oct 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-p8F0LZeSmXaXj2_z1nwQLP4Tc2-__La16W_jjtO1e4g.jar
    Oct 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-oLKayx0Zd7a2FpwrVgxu3ES61JSIxJ0qmkdyrcm8Xig.jar
    Oct 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-HFhp0Gr8_GFPvX-QtEW6b5JzVFye-TYJfMno8eTunE0.jar
    Oct 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-MYHPfnrEFso2J0TL2ml2BTQ8smtuomn7HmT9P9pkREA.jar
    Oct 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-syA-BEsPyxoifQGzPzgo3teGFxtMJmBXyeGQEtALmas.jar
    Oct 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-d_AdJ77he2hDLb6gyyVcZBGQzcr1lgIkiWfJGM2IHmY.jar
    Oct 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-bGTUmodn4sk8rmAVBPAHX2XYqbNDzTmVuwQzFIoUNYU.jar
    Oct 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96042 bytes, hash 16c471b561856f408c43f0b41877579539c001b22500504a2b530a59afd09788> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-FsRxtWGFb0CMQ_C0GHdXlTnAAbIlAFBKK1MKWa_Ql4g.pb
    Oct 25, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 25, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-25_11_45_20-17126885541087710338?project=apache-beam-testing
    Oct 25, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-25_11_45_20-17126885541087710338
    Oct 25, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-25_11_45_20-17126885541087710338
    Oct 25, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-25T18:45:20.264Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 25, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T18:45:27.485Z: Worker configuration: n1-standard-1 in us-central1-b.
    Oct 25, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T18:45:28.110Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 25, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T18:45:28.148Z: Expanding GroupByKey operations into optimizable parts.
    Oct 25, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T18:45:28.179Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 25, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T18:45:28.251Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 25, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T18:45:28.280Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 25, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T18:45:28.312Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 25, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T18:45:28.344Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 25, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T18:45:28.782Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 25, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T18:45:28.845Z: Starting 5 workers in us-central1-b...
    Oct 25, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T18:45:38.619Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 25, 2020 6:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T18:45:54.662Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 25, 2020 6:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T18:46:14.048Z: Workers have started successfully.
    Oct 25, 2020 6:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T18:46:14.080Z: Workers have started successfully.
    Oct 25, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T18:46:45.699Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 25, 2020 6:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T18:46:45.844Z: Cleaning up.
    Oct 25, 2020 6:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T18:46:45.921Z: Stopping worker pool...
    Oct 25, 2020 6:47:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T18:47:37.513Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 25, 2020 6:47:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T18:47:37.548Z: Worker pool stopped.
    Oct 25, 2020 6:47:44 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-25_11_45_20-17126885541087710338 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): faf5b696-9929-401a-8bd2-81f9a84586ac and timestamp: 2020-10-25T18:47:44.216000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.128

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 25, 2020 6:47:44 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 39.956 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 28s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/c4hborqtosfrw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1158

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1158/display/redirect>

Changes:


------------------------------------------
[...truncated 271.64 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 25, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 25, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 25, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 25, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 25, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 25, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 25, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1214853303]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:494)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 25, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 25, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 25, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 25, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 25, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 25, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 25, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 25, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
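
(For context on what this push-down buys: it is roughly what a hand-written BigQueryIO read would express by requesting only the used columns and letting the Storage Read API evaluate the supported predicate. The sketch below only illustrates that idea with a placeholder table reference; it is not the code path this test exercises.)

    // Rough hand-written equivalent of the projection + filter push-down above.
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;

    public class PushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        p.apply(
            "ReadHackerNews",
            BigQueryIO.readTableRows()
                .from("my-project:my_dataset.hacker_news")  // placeholder table reference
                .withMethod(Method.DIRECT_READ)
                // Only the columns the query uses are requested from the Storage API...
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // ...and the supported predicate is evaluated server-side.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
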
    Oct 25, 2020 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-4M1DOzLe0rG5DtqOVxFEVdvXNCqtvtKiS-zT74TymlE.jar
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-SWxAD5-2lHAH_v2QRx2la6J6fpEkLzt7KjAdGRR3T3c.jar
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-3rr3dv1WtEF3iAGOfsoRagNXDLdx_JYsaIzytqbserE.jar
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-BkMJ8a1IQRQx3ywpCZ8i3dpYMRHHxxXSIynWD_hm7OE.jar
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-Nrz8VgxHPSRUETCIrm0w817bS_JpSiHCkxaQgyvxWVw.jar
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-3uFV5gqM4s-wbJEJKbk93JiWn-7MC4cN1bZwP_Iru4M.jar
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-4xWRM3oymSjlH37S1oi4XiVyWPO9t0FP-Y5oeyYwrBM.jar
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-vVitMEV80rFWHNrm9gFZig1MKqBkyjDHi3rYPws8kXk.jar
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-qw5epcD9ojhA7fTYD1l_g1AkXntn1CkZQ-q3xz7frZ0.jar
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-0wgfr7T125LIGLkR8jMoCzUku06CoXGzVEZAJBtkV24.jar
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7319242560750314634.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-uuj3tyvv-a4hq0rITLbOYtBt8FXuUOqbBv8ZYXiOKgA.jar
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-reTKJvFS7MZBFaQI8WFnOM_DjdlaQenH_c7eu-KaJTc.jar
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-cmWz9IfW9iBMtbdcyBz_zHSJz8rcIA2x_zddx0Jvi2M.jar
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-LC4TyB3kq4BvvfxTMyz7AStnfrk1wD7wU8etBG3C9EU.jar
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-jfnRixoCedpfFlYt4GVXUcAWjNsI4DFpiSO3__NSA-o.jar
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-YdPxekgF6f97En4232XNq9id8kD3gPD6Bj5faMMy_6Y.jar
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-yT7ZHMmisqReZ9OmGLPXL4tBllcO29OKljnNClHnIZ8.jar
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-xZGLQf1s3B2tYoy_7upD3APyN0HK3NWUJNBTkmZX8lM.jar
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-89p0I2ZtKgOu3ScbsVK_u8l4DzegAnWG3dmw-4D2zlU.jar
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-ans7nrMCLTMyFnnEuvpXsKq2MX4KljjoaJxxos1IEnU.jar
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-aoF1cNdugeG00mOEDnOFIU7C5JpmmJ0ncn_aAjPN9U4.jar
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-1fRD58xNYNiB0Vt3oZXi_VbLmomodNRteJroosN9Guc.jar
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-W9WS2Fw3dY_5brviX4QQB-OCPW6sDPg4com75pQR_yw.jar
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-5r3hgMdTlmsjJCrpy2Ibw-7SQEMnQcb8Zc7peG53-gE.jar
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-dssKTkcU_JqN2IKm8TlrQ7mUzy9YEafHBTl7IQNY89c.jar
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-LC4TyB3kq4BvvfxTMyz7AStnfrk1wD7wU8etBG3C9EU.jar
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-VFW7uRPS_YtUFaYjtv-VZzy6eOF4qzLaVSif8Bbs7IA.jar
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-hzBEVkJ3cHMqurqB8fYKDkMc3iACAnXFKnqZKbq-MRI.jar
    Oct 25, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-RRKCFOB9MAhnLO05VxkH3z4n6y0O_UXef78LlbKuKGQ.jar
    Oct 25, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-9pJ4h54K_-Igzqo3jjZ5hhUVdKF0lPB4YKXDS_OqVaA.jar
    Oct 25, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-GDKr1HXZ19G2dbywVKF76XIuT8aREZzYTDBGDGGyG4A.jar
    Oct 25, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 25, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 25, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 25, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 25, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 25, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 25, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96042 bytes, hash 18c2ca8d1565c43e78a24819b2543c274cb9ce55931479457db5c99c0e44ddb8> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-GMLKjRVlxD54okgZslQ8J0y5zlWTFHlFfbXJnA5E3bg.pb
    Oct 25, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 25, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-25_05_45_16-418170915013169280?project=apache-beam-testing
    Oct 25, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-25_05_45_16-418170915013169280
    Oct 25, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-25_05_45_16-418170915013169280
    Oct 25, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-25T12:45:17.019Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 25, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T12:45:24.041Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 25, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T12:45:24.642Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 25, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T12:45:24.683Z: Expanding GroupByKey operations into optimizable parts.
    Oct 25, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T12:45:24.747Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 25, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T12:45:24.811Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 25, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T12:45:24.849Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 25, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T12:45:24.883Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 25, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T12:45:24.916Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 25, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T12:45:25.304Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 25, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T12:45:25.374Z: Starting 5 workers in us-central1-f...
    Oct 25, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T12:45:31.158Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 25, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T12:45:51.462Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Oct 25, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T12:45:51.494Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Oct 25, 2020 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T12:45:56.718Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 25, 2020 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T12:46:14.001Z: Workers have started successfully.
    Oct 25, 2020 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T12:46:14.064Z: Workers have started successfully.
    Oct 25, 2020 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T12:46:44.565Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 25, 2020 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T12:46:44.706Z: Cleaning up.
    Oct 25, 2020 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T12:46:44.787Z: Stopping worker pool...
    Oct 25, 2020 12:47:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T12:47:38.623Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 25, 2020 12:47:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T12:47:38.664Z: Worker pool stopped.
    Oct 25, 2020 12:47:45 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-25_05_45_16-418170915013169280 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): fd9749b5-09cb-4277-b86b-89d675c1403c and timestamp: 2020-10-25T12:47:45.264000000Z:
                     Metric:                    Value:
                   read_time                    12.572
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 25, 2020 12:47:45 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
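
The warning above means the perf-test harness was not given an InfluxDB measurement/database, so the read_time and fields_read values printed above stay in the console instead of being published. Assuming the option names used elsewhere in Beam's performance-test jobs (an assumption here, not something shown in this log), publishing is typically enabled by passing InfluxDB settings through the integration-test pipeline options, along the lines of:

    > ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest \
        '-DintegrationTestPipelineOptions=["--influxDatabase=<db>", "--influxMeasurement=<measurement>", "--influxHost=<host>"]'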

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 2 mins 41.838 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
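
For reference, the failing task can be re-run from a Beam checkout with the suggested flags (a local run would additionally need the GCP project/table pipeline options that the Jenkins job supplies), e.g.:

    > ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --stacktrace --info
    > ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --warning-mode all   # also surfaces the Gradle 7.0 deprecations noted below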

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 28s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/ohpniqz74xoaa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1157

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1157/display/redirect>

Changes:


------------------------------------------
[...truncated 272.71 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:494)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 25, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 25, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 25, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 25, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 25, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 25, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 25, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1214853303]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:494)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 25, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 25, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 25, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 25, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 25, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 25, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 25, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 25, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 25, 2020 6:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-b6gTgNc3Vh8rehJHwqkwbKODZ-DskQXrWCATz8ezx5o.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-QwpWeV4OnynqvAVaZK7BEN5vEV095XGlwh62AqbVK6g.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-bCuYhcsIKPmazTgtO26bYEjuk0e6C7Z77BQv18Lvfw4.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8131455832328593733.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-LZCdOLyQ125totQGYDp5_zWqf3Tv_01W3cNkSPm18pw.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-PzpHg9RWMVjEDE5hJWqR_kx6tQoaJGb1Aw_SzUjXAH8.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-dvHuK8N6WpQaLspJ40dxWKjo_vshMDm3c60dhuCUlXs.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-GTmlNz1kfQ5mc_5jCRbHdqp3evXrYvI6LAKkfnFEPRM.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-lWbIZcbnWCqSXZl6LhqNTJM_uiAsXVbzFeEMLOrTnIY.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-Q7z7vKJ4aVd2xBvNpFJslMLGs-hHk4xIh99peINmwDg.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-7yPzFXv8Z0MpIw1mRJxVPQ5hcFxsOG3CfIUDzbkWZDg.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-rJYvCAMfoY9NHJcH33ZYkjf8gczLA_si2zmBLw7D4fk.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-kI6Z5LGsMZWVd2xLC5Kg8LZ2Ya0JwLINRf7Hgn3Bfj8.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-HDES6-KDDKDOFWGWMdtLtv9xvS0wnWBfWgOJ3Hvc7uc.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-km26SzhafUr4aD-ZyQr0fYvGhFlaO2PP5p-CzIX3IwQ.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-NVirnxNZOpIAHciBsWWadX1MoNgjcJQ2xJFuapDt9xE.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-2zWuia4A6q87V1YKiI7LODg58sbOXo77BEbhbErZ1p8.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-b6gTgNc3Vh8rehJHwqkwbKODZ-DskQXrWCATz8ezx5o.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-2Zn2BGE21w-mDK64PFg8MMM33Oht3G1B_Fl7BJONCdY.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-HjlkLvw-4aoCaJ_PkIHlXxHBKfStON1IiwzG1KDpx4s.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-PehHGeAS3vRcjjTWBEJ8ksuXFT-G7ygH9IKzOlByJng.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-L6ejUOsS7AV1IMiB_yACvplBPt_Y5vEnVNO-agj4rts.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-PFo9R1QraWF9rDZ9rHmlXitJlIdw7CbzBP5EDPa79n0.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-1RiCvhKQPjtW-CwxySgNV1Xvl01YDWewUCv75JFjXh4.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-ET2gOBqNVMfSg2TXvtKtSd4VsFvZ4CULj0QjBUZsUOA.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-DzfQgbd-wEyrSefRrh2Q-HEeONqLD-qT47V5KsASPhQ.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-aNEe4Qx8Dh0ykp-1ls2wr5lFV46ilGCHEXSgRQc5QoA.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-Gr1oOyKRqw-i0_9GPS-hu833NURT9OdAknd028SsCWE.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-AoDI08at5AzM57nMWqZZxEtre4urSxgcx8ROhrYG7Gk.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-GOG4hLglzfY3VPQ13ibayyBSQUOTnQdCI9EvzugrI0U.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-9nO9Ix49oEG-EMbWLdaXfSahQX-GPTFYAcMRM0P-8Ec.jar
    Oct 25, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-Ppzbdea4MBF-LrRq6JdgiCZ8ViN86pEYSF74hB7v9o0.jar
    Oct 25, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 0 seconds
    Oct 25, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 25, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 25, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 25, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 25, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 25, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96042 bytes, hash 134354d4723c1d4a884d75da0a901b331f6931aad19aaa86c1afa39768e85519> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-E0NU1HI8HUqITXXaCpAbMx9pMarRmqqGwa-jl2joVRk.pb
    Oct 25, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-24_23_45_15-13844997698167544185?project=apache-beam-testing
    Oct 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-24_23_45_15-13844997698167544185
    Oct 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-24_23_45_15-13844997698167544185
    Oct 25, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-25T06:45:15.638Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 25, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T06:45:23.364Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 25, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T06:45:24.082Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 25, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T06:45:24.123Z: Expanding GroupByKey operations into optimizable parts.
    Oct 25, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T06:45:24.149Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 25, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T06:45:24.218Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 25, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T06:45:24.244Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 25, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T06:45:24.270Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 25, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T06:45:24.330Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 25, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T06:45:24.853Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 25, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T06:45:24.934Z: Starting 5 workers in us-central1-f...
    Oct 25, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T06:45:43.646Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 25, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T06:45:53.083Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 25, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T06:46:12.726Z: Workers have started successfully.
    Oct 25, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T06:46:12.756Z: Workers have started successfully.
    Oct 25, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T06:46:45.953Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 25, 2020 6:46:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T06:46:46.082Z: Cleaning up.
    Oct 25, 2020 6:46:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T06:46:46.137Z: Stopping worker pool...
    Oct 25, 2020 6:47:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T06:47:47.102Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 25, 2020 6:47:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T06:47:47.130Z: Worker pool stopped.
    Oct 25, 2020 6:47:52 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-24_23_45_15-13844997698167544185 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 682cd20e-4f46-4454-bb28-4041466de482 and timestamp: 2020-10-25T06:47:52.968000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.783

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 25, 2020 6:47:53 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 2 mins 51.202 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 37s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/phahjc67n2akk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1156

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1156/display/redirect?page=changes>

Changes:

[Kenneth Knowles] Move @Nullable annotations adjacent to the type they annotate

[Kenneth Knowles] Suppress nullness warnings in files with nullness errors


------------------------------------------
[...truncated 277.95 KB...]
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 25, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 25, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 25, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 25, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 25, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 25, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 25, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1214853303]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:151)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:494)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
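
The three root causes listed above all point at the same fix: a PCollection<Row> needs an explicit schema (or an explicit RowCoder) attached before the pipeline is finalized. The following is a minimal, self-contained sketch of that fix, not the integration test's own code; the field names and types simply mirror the projected columns (author, type, title, score) and are assumptions.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFixSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Schema mirroring the projected columns; INT64 for score is an assumption.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        Row row =
            Row.withSchema(schema)
                .addValues("someone", "story", "Example title", 42L)
                .build();

        // Attaching the schema up front lets Beam derive a Coder for Row elements,
        // which is exactly what the failing PCollection was missing.
        PCollection<Row> rows = p.apply(Create.of(row).withRowSchema(schema));

        // For a PCollection<Row> produced by an upstream transform (as in the test),
        // the equivalent after-the-fact fix is:
        //   rows.setRowSchema(schema);             // or
        //   rows.setCoder(RowCoder.of(schema));
        p.run().waitUntilFinish();
      }
    }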

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 25, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 25, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 25, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 25, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 25, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 25, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 25, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 25, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
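
For orientation, the push-down recorded above (usedFields=[by, type, title, score] plus the supported filter) is what a hand-written BigQueryIO read over the Storage API would state explicitly. The sketch below is illustrative only, not the test's code, and the table reference is an assumed stand-in for whatever the HACKER_NEWS table is mapped to.

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        PCollection<TableRow> rows =
            p.apply(
                "Read with Storage API push-down",
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full")  // assumed table
                    .withMethod(TypedRead.Method.DIRECT_READ)
                    // Column projection, the analogue of usedFields above.
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // Predicate push-down, the analogue of the logged filter.
                    .withRowRestriction(
                        "(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
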
    Oct 25, 2020 12:45:30 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 25, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 25, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT--PPrXPywqwyBMSSvUVFEMvpePX8FU4A9VbQHakYe5z4.jar
    Oct 25, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-WyqBiLk1L_fHM4UKtNG_IOipWARWB-CK8WC9Hnrr7jo.jar
    Oct 25, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-WHqve_hyD_vECC1-n1fXiGndIC-te3CRsqNNnL4jcGs.jar
    Oct 25, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-akMpPunfihQBSipPbdgmUr1A3iMvdm0qx3EdjIXT5uU.jar
    Oct 25, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-W1AFSx6itQlo-pvtL6qWnkR49AgvaBYrgfqa_3P5l2M.jar
    Oct 25, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-A7Ke2pXbdg5M07zHoFzWeM4cBehwVUjHsALNBnHftAc.jar
    Oct 25, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-T_S5z-kJkTRqqhr1hqOjPSVIcqfTmaUq_7HvY0c5Lxk.jar
    Oct 25, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8616690392746134612.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-If9oOd3nu0AfloAUer8Y6Ljb3yyd0iNuXXV9r6eG98s.jar
    Oct 25, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-AmxJvUcWSa1clkLVlCSngig1H6SPrp6mndMcK0satyQ.jar
    Oct 25, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-tPXQZHPduTMskg9rZXJNc8B3p6ffclZGA20fLwMyiuc.jar
    Oct 25, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-eoeAXi-chcXB7kanlvepf13uF5reOQzfzS6LRhMqBlQ.jar
    Oct 25, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-j8vCOAZ1y-J5wVrBWPxS0eyzMM0-7bB2vHk4s8DRzR4.jar
    Oct 25, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-kxp96TkMzUHC3hKw5AbxehJNX7diKlIs90Dn2Z_DXkc.jar
    Oct 25, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-V7HSEIL6pNHwiUvMDbS0kD6ZCBiE_J-avIQLUMfqwi8.jar
    Oct 25, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-rfAuey87bKCxGOfTLHvvSoNlvX0DsEXg-y5F4C9O0ic.jar
    Oct 25, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-QKObKWZbgCbULCiogkfhG6xRY2HP9YJaLyPTXNxVGKk.jar
    Oct 25, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT--PPrXPywqwyBMSSvUVFEMvpePX8FU4A9VbQHakYe5z4.jar
    Oct 25, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-ldvpDtS7dcmEqjOLwcuWBL-Som2qX8xFt36bkzpDDXE.jar
    Oct 25, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-x4o3YolLEwxVQvkH8RLbVxXY36DyRfxaa3lyVJR6L0M.jar
    Oct 25, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-TWgAW2RqGnehOWiLzT7RMWG4tYZgUV6eR8MJ2qWE_s4.jar
    Oct 25, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-RJfdeESG2mUT27shPXKqALy09ZQffnkcRVbpCium0dg.jar
    Oct 25, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-zfxfyT5kvGOjSiKPZrsbmXylob7TuPKqj1XKWsUJTuw.jar
    Oct 25, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-W5iK4DfrJj6hmBLVhVgIzUIH2WGA9tNVuX9b7s3oHHA.jar
    Oct 25, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-_PyDyjVJgdy9DHlx0bjfhvkZXFOyxg8DDTASzRjakyU.jar
    Oct 25, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-WRI20ss-4Zod8RBHd1eXc8ePQFxwrNcA-v8Sgsz6bwQ.jar
    Oct 25, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-B65l7iV1z-08E46w-kdgMzjsnKiyrUaZ3GWu2XBLWEI.jar
    Oct 25, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-7PFEE6_7XCUBmQO0Mcdo2bKrv0mrlgY8pfBFSpm8hPg.jar
    Oct 25, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-3g7WA8eHMheIW5lBbyNRLm6_Ns_L2txEUA34kDpaXZc.jar
    Oct 25, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-SzpFdNDA-g_XMIlcLepzbjpq22IoZtslonG_mtugSgg.jar
    Oct 25, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-YWy7RC1mj17xpmBhl2czXvxfJDLAWBRkI2t3oY8DS_w.jar
    Oct 25, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-dUXtbx_czh-oECJ_62RqiU3y2eRtYuETcphf6OlPVKs.jar
    Oct 25, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 25, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 25, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 25, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 25, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 25, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 25, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96042 bytes, hash 7ab7194796ac40d29a6518e8ea77d8e054ff28fd05a1506dcccf2c4531a34a85> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ercZR5asQNKaZRjo6nfY4FT_KP0FoVBtzM8sRTGjSoU.pb
    Oct 25, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 25, 2020 12:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-24_17_45_34-11066715899745502195?project=apache-beam-testing
    Oct 25, 2020 12:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-24_17_45_34-11066715899745502195
    Oct 25, 2020 12:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-24_17_45_34-11066715899745502195
    Oct 25, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-25T00:45:34.245Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 25, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T00:45:41.403Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 25, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T00:45:42.028Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 25, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T00:45:42.057Z: Expanding GroupByKey operations into optimizable parts.
    Oct 25, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T00:45:42.129Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 25, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T00:45:42.195Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 25, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T00:45:42.228Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 25, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T00:45:42.265Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 25, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T00:45:42.289Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 25, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T00:45:42.692Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 25, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T00:45:42.764Z: Starting 5 workers in us-central1-f...
    Oct 25, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T00:45:56.598Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 25, 2020 12:46:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T00:46:10.523Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 25, 2020 12:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T00:46:32.132Z: Workers have started successfully.
    Oct 25, 2020 12:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T00:46:32.159Z: Workers have started successfully.
    Oct 25, 2020 12:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T00:47:02.203Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 25, 2020 12:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T00:47:02.384Z: Cleaning up.
    Oct 25, 2020 12:47:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T00:47:02.470Z: Stopping worker pool...
    Oct 25, 2020 12:47:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T00:47:51.999Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 25, 2020 12:47:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-25T00:47:52.044Z: Worker pool stopped.
    Oct 25, 2020 12:47:57 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-24_17_45_34-11066715899745502195 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 70822f2b-b4ff-4813-9928-df51ca832d1b and timestamp: 2020-10-25T00:47:57.638000000Z:
                     Metric:                    Value:
                   read_time                    11.483
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 25, 2020 12:47:58 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 37.697 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 41s
107 actionable tasks: 72 executed, 35 from cache

Publishing build scan...
https://gradle.com/s/rq4w3plolgaji

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1155

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1155/display/redirect>

Changes:


------------------------------------------
[...truncated 271.97 KB...]
    Oct 24, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 24, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 24, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 24, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 24, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 24, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 24, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1214853303]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:150)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 24, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 24, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 24, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 24, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 24, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 24, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 24, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 24, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 24, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 24, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 24, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-V_gc2Sia7Zx7X291v5HF1UwiulzTfhVb46HpcJ0VjnY.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-A3dlau1sbQjz7w3G5hI2QaQC66qi879-iaHWrbz6wps.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-Ka5x6cJj-jyHuyBYprun34XmT-H3O6pWMtLRTToU-PM.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-DPpJ88dL6mufvI9jRxgp0egmHbitBVwCp2kpwu7d_e4.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tCTGSvfho8lJSmEG7iRxcrBnbxqX9Lov2LSxM3Pynes.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-CH0AODLgSUOHkdv0vrdXSRg952zimScHfmeA4Kwi7WU.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-Bf6gRiNChHqVmpsg6dzQjQC5a1TC1J_AX9WMx3oIAgU.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-IR4dqUUHSc9rXDBajekdEDnSOk57HfA3CdukVIF4Z7Y.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-zvPivYpe9gCAMyCChV2SzrPgeDQ5-5yCKDs0zMdd0-w.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-F9ifQZ9-scW_VOVComh1bUcP7zQtf2PPW15SbT6UTcw.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-P14mNsxgh5U1IeqOucWiR_hR751n9ZZvrdEc0aXtfE0.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-0crgmNsrK-auNX2Jyxfad2DM2CNfnwI8E3QP1cLTfKY.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-gZYT79zUNE5xobgJ6M2z4jnO4TsH3FQFVkKwJgMs2_k.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-LBvfa95WF_jbPKQGsDtQoJO5ElaGsr0bmNDLgL6QEfk.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-wI8PNmIjI2U0-z1J5PuHu48w_ZbGZOvZWh0nbOmhIR8.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3337289228418764359.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-yt8fsjL0Qwojn0oqFmaYn5g0F10yPVpfWLhBv4kVvnk.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-V_gc2Sia7Zx7X291v5HF1UwiulzTfhVb46HpcJ0VjnY.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-cC8SzZ-4dSrAK6n1BjdRszlHzjjlCnK-sxzAT9Cdpic.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-A4XAN_yPb-7g5DsA2uWeIOYuZrb_j9anASFwWwg-EbM.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-aUdmrU7vzYiNF-NXhtMZFDHG2oOzdJXuTjZDoaDx3kU.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-On4cqv5YtLqohpzdjFYNDUM73ypccXAbnUZhg_KBQRE.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-rqyqKrgzo3wTOL-c7mRnRkH3gaaUTYdp2cR-LTtl3M0.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-hQhxwOtlFWm5dIlvyqppo5DdilDzbJDWbPcFQ_IbZIQ.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-m0OHoKLPJjKgddk7CjVlubOvvNDpvA4NQjUoYSKqC9U.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-L-eDU-tEpO0sSJB-Ir4rKZAghsA947fQoA36BMiXt2o.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-TJZ7fblmu_OCGg8uZyd9N7SLC3lxTmMonBXyI53zzQo.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-vIXn1I7z6GSl9G6PO5iKPR9QjJPJe7cubZrrRjAyU1g.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-TW5WDxb1tp7aF--R_EogR15goEG7mJNxgKIleO2qvDg.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-UbsLrR4atN2kpkRZaGCLYuvu3u881oYwhA68ucLnZGQ.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-r_SbsXk1sIGiYR7SrdPdij8d1X7rSPEZXPi4BKw4gq8.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-OW9h6bH_b2OoGXv9ccndD2T3X4WODhvqk8zveogcIkU.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna/5.5.0/e0845217c4907822403912ad6828d8e0b256208/jna-5.5.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-5.5.0-swj66_5O1AnehBDgpjLRZLISawNfbqz_lo05CMr7TZ4.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth.visible-assertions/visible-assertions/2.1.2/20d31a578030ec8e941888537267d3123c2ad1c1/visible-assertions-2.1.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/visible-assertions-2.1.2-RQSulosjfNzcto_1sHqmOr5JkvkHp3w9YSCqm5BBQBw.jar
    Oct 24, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth.duct-tape/duct-tape/1.0.8/92edc22a9ab2f3e17c9bf700aaee377d50e8b530/duct-tape-1.0.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/duct-tape-1.0.8-Mc7xLd7JedH4bXz3CMQaF9pSPQXGhf1mQunQsq3bckA.jar
    Oct 24, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 186 files cached, 33 files newly uploaded in 1 seconds
    Oct 24, 2020 6:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 24, 2020 6:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 24, 2020 6:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 24, 2020 6:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 24, 2020 6:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 24, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96042 bytes, hash 47166df537718a285a5539b3ce20de1cc6a40773f379e18c07452cfd836ddc50> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-RxZt9TdxiihaVTmzziDeHMakB3PzeeGMB0Us_YNt3FA.pb
    Oct 24, 2020 6:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 24, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-24_11_45_24-12089962121552579933?project=apache-beam-testing
    Oct 24, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-24_11_45_24-12089962121552579933
    Oct 24, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-24_11_45_24-12089962121552579933
    Oct 24, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-24T18:45:24.755Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 24, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T18:45:34.607Z: Worker configuration: n1-standard-1 in us-central1-b.
    Oct 24, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T18:45:35.247Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 24, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T18:45:35.329Z: Expanding GroupByKey operations into optimizable parts.
    Oct 24, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T18:45:35.367Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 24, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T18:45:35.461Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 24, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T18:45:35.489Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 24, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T18:45:35.515Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 24, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T18:45:35.546Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 24, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T18:45:35.930Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 24, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T18:45:36.012Z: Starting 5 workers in us-central1-b...
    Oct 24, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T18:46:00.780Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 24, 2020 6:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T18:46:01.692Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 24, 2020 6:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T18:46:18.297Z: Workers have started successfully.
    Oct 24, 2020 6:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T18:46:18.332Z: Workers have started successfully.
    Oct 24, 2020 6:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T18:46:51.504Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 24, 2020 6:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T18:46:51.692Z: Cleaning up.
    Oct 24, 2020 6:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T18:46:51.761Z: Stopping worker pool...
    Oct 24, 2020 6:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T18:47:42.209Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 24, 2020 6:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T18:47:42.258Z: Worker pool stopped.
    Oct 24, 2020 6:47:48 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-24_11_45_24-12089962121552579933 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c3d70e65-29e6-4ecf-9c5a-48c41eb42254 and timestamp: 2020-10-24T18:47:48.393000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.739

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 24, 2020 6:47:48 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 38.709 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
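
As a hedged example only (the exact command this Jenkins job used is not shown in the truncated log above), the same failing task can usually be re-run from a Beam checkout with the flag Gradle suggests:

    ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --stacktrace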

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 31s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/b4txoqcjxdu5k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1154

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1154/display/redirect?page=changes>

Changes:

[sychen] Add a well-known ShardedKey coder in Java/Python SDK


------------------------------------------
[...truncated 280.70 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 24, 2020 12:45:42 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 24, 2020 12:45:42 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 24, 2020 12:45:42 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 24, 2020 12:45:42 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 24, 2020 12:45:42 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 24, 2020 12:45:42 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 24, 2020 12:45:42 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1214853303]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:150)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
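
The failure above is the schema/coder problem that the exception message itself describes: a PCollection of Beam Rows has no inferable Coder until a schema is attached to it. As a hedged sketch only (this is not the code of BigQueryIOPushDownIT or of the RowMonitor transform; the class name, field names and sample values are invented for illustration), the remedy the message points at, PCollection.setRowSchema, looks roughly like this in Beam Java:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {  // hypothetical class, not part of the Beam test suite
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Schema mirroring the columns projected by the test query (types are assumed).
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(Create.of("story,3", "job,5"))
                .apply(
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void processElement(@Element String line, OutputReceiver<Row> out) {
                            String[] parts = line.split(",");
                            out.output(
                                Row.withSchema(schema)
                                    .addValues("someone", parts[0], "untitled", Long.parseLong(parts[1]))
                                    .build());
                          }
                        }))
                // Without this call, PCollection.getCoder() fails with the same
                // IllegalStateException reported in the test failure above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }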

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 24, 2020 12:45:42 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 24, 2020 12:45:42 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 24, 2020 12:45:42 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 24, 2020 12:45:42 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 24, 2020 12:45:42 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 24, 2020 12:45:42 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 24, 2020 12:45:42 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 24, 2020 12:45:42 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 24, 2020 12:45:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-ADrRWA1rwGNO6ZKhrVu-QcU-dLe3H3s3JnbHdUzKkjM.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-_tW6EC1G02V4c9ro4OUlzOzq05wAerX6fBMOUoulBVM.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-jKl0UVsR__zLwnJ77w-K3sR1VWCr-HdhM1LUgowm3G4.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-77FhB4PFvA8UA8QyiFJKXoXuXZTqX1_xsfAUrompUxY.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-OvJT2fDJ9dHLLidB0xycGRxjKnC0wxfBoY2YiBVtcFk.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-UHM2WRLeGs12wgjIlP-vd2g1nIzzrX1q-8ED5w-ATTY.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-diTB2vBWAwXG5kbUdBvMKV-cVPLbEWvb9-7HGYfA5FQ.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-zFJcgCtxwh5RCZCNSt1r5LZMICzSw0f_Dq-hMfJgBX8.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-O5jrsc5X2Gv4iaexSV_3QxVeEhbs7cWi8CcvW23d6-4.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3956087330400090588.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-5OxJ6auxNURWAN1QTZ8-m8zOy8M-ELm0dhcC0ae7Zr0.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-in4jkk4rt7WWmt1_1xdsaSQhoDhCOz_i08nP0NFRo2k.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-fhvRyV1mK3Y6Y8Rss1NS5YHJy4opvIEym4xmJw6sESM.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-C_ZNAoAgp_JGe1tZtIoUcHbuORQokCyejOSpuS6iKik.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-RYaJTnvk9PrL5rJxGi2LGD8akIr5ldigcDesYWuIhyw.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-28nbStW4ka8V3GefVMWFXZe5KVfUe2NdnGNSgCbiZ90.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-nxRT71KgDgYN90cCbE5g6z2KRCKktgO6OgGVivsSSak.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-Lc4Fgb3SV05nR6LjeP2zcDXk_EWGLv8wrusuWu2A0IA.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-tjaLvFFmhpFx1ObmHEIkrRFXxGO2SD8lo_Yd2c90XNE.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-FL0cEab6gwJfIn7UeZBAAPdrW-TWWaRSuwnbawNdzMo.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-c5KvrtnsdqM0LdMnlAsc72qiNu-2KuSUdb2QmRMb_yg.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-5I8CmXQGTmsTlOMm0tWMZtvxzVOgnZefogHsP3K9Jew.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-G7tvfuEyAyXDRE0MMk-ZMileDY6m_RfZzpeWveC5Cgc.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-5ozXEfi_EgbvBpzGwFEF84nFX_j1dOo_EVYu4W6IuXQ.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-to6O3YDQanqXOAH038_38IO4r-UTO1KM6Rm2YtS9ZK4.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-wdIgz4eVnnen_E4aRoQ_5LlyTJ1Vsz46dyOD6YK5Q1Q.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-r6kJvnrqttoqN4IIqhAs5KVDzOOOyYqbYGt-mDfeYQY.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-j6fwj7T6O_8EayKvGDOOMj7X9E7YXLz0uJgfPr-z4Rk.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-_tW6EC1G02V4c9ro4OUlzOzq05wAerX6fBMOUoulBVM.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-0Akx50RRHv-YYNopagG9AnBc-OdtUz7tffkmb1QuVRc.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-Nl1kj176s0mzIBrM73nUk5dksbqT86s6mObzqDAP7Pc.jar
    Oct 24, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-mMikNJlTWHURGX7I1zCIJSBwMFhF6q8al_WJpSqq3F4.jar
    Oct 24, 2020 12:45:47 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 24, 2020 12:45:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 24, 2020 12:45:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 24, 2020 12:45:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 24, 2020 12:45:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 24, 2020 12:45:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 24, 2020 12:45:47 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96042 bytes, hash a785fa5b0dcedcd8982e2ee2dfcc5bf0a27bf01bde6f4d9ceaa927b23fec603d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-p4X6Ww3O3NiYLi7i38xb8KJ78Bveb02c6qknsj_sYD0.pb
    Oct 24, 2020 12:45:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 24, 2020 12:45:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-24_05_45_47-10866231806858623284?project=apache-beam-testing
    Oct 24, 2020 12:45:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-24_05_45_47-10866231806858623284
    Oct 24, 2020 12:45:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-24_05_45_47-10866231806858623284
    Oct 24, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-24T12:45:47.798Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 24, 2020 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T12:45:56.613Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 24, 2020 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T12:45:57.251Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 24, 2020 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T12:45:57.295Z: Expanding GroupByKey operations into optimizable parts.
    Oct 24, 2020 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T12:45:57.338Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 24, 2020 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T12:45:57.403Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 24, 2020 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T12:45:57.440Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 24, 2020 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T12:45:57.468Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 24, 2020 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T12:45:57.505Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 24, 2020 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T12:45:57.835Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 24, 2020 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T12:45:57.907Z: Starting 5 workers in us-central1-f...
    Oct 24, 2020 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T12:46:21.016Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Oct 24, 2020 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T12:46:21.046Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Oct 24, 2020 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T12:46:23.668Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 24, 2020 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T12:46:26.347Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 24, 2020 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T12:46:43.336Z: Workers have started successfully.
    Oct 24, 2020 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T12:46:43.364Z: Workers have started successfully.
    Oct 24, 2020 12:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T12:47:13.311Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 24, 2020 12:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T12:47:13.472Z: Cleaning up.
    Oct 24, 2020 12:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T12:47:13.551Z: Stopping worker pool...
    Oct 24, 2020 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T12:48:03.571Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 24, 2020 12:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T12:48:03.619Z: Worker pool stopped.
    Oct 24, 2020 12:48:10 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-24_05_45_47-10866231806858623284 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): fc36e8ef-224b-42de-9ac6-07b14941a46e and timestamp: 2020-10-24T12:48:10.075000000Z:
                     Metric:                    Value:
                   read_time                     9.992
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 24, 2020 12:48:10 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 36.638 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 54s
107 actionable tasks: 73 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/smdnfexpxv2qm

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1153

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1153/display/redirect>

Changes:


------------------------------------------
[...truncated 273.55 KB...]
    INFO: BigQuery method is set to: DEFAULT
    Oct 24, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 24, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 24, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1214853303]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:150)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 24, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 24, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 24, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 24, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 24, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 24, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 24, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 24, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 24, 2020 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-J9HhIGSjwwxl2e302oBKMG1afqkx3zAd9TtIb2DiFM4.jar
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-LhMhW_9T9d_Bjw1k2V036JNVYE6R_1lZdbUmvUllvFE.jar
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-MqUmCXcQ9W9aLAvO6LZw-VH5aA7XciwfQWJscUwFZUo.jar
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2853863709400411290.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-3CpWM4WyOS_B__x9vJhgcZjHJMb3IweeeEc3erz5l5o.jar
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-teBjwmVSq9MgUcCYd_XiRzGx2q1kB8PSXFrgn0Tz5HY.jar
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-ScWsIJIthmxI8pU8rXLDU5ykEnLhKqJjecXE6pkH2aw.jar
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-udHrUbSMX7NIEg0cYhnhchd0Oz2VQpxWuH2bprsPmU0.jar
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-TFl9pK-17-H3aRm8ZhRDQyZwgsrHyal6tF8jV9Yn1fk.jar
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-z8befhbyCiyCKv0o1hkbUNXsz7P0wIz4AU8ut4K38qY.jar
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-0Ds8qNMZvTDkfbbER9uNZgJdk-Oa2_4bi1iVJ62LIdU.jar
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-RJB-NONJVHveRNrS8W2Z6RlUEo8FX8CYbarqZkBq0nw.jar
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-5fUlQ-Yw1N2MHB8sFbsB3V-PUJvz7BE-q2FHokd57j8.jar
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-kpIV3HybW3fqMzkgj00_Nv1sTiC5NpJ05AuErmmshoI.jar
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-Dk6awPS9EqGmZw2whK6kWlUjzuRRFlTNSntvLPln7jY.jar
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-SbWgCBOYJaa-bfYGdLwclNs-J3nKwW6P_LD-xcWtIuU.jar
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-BqcoC7kFatTby2uB5KzSEXdh32_YDDhTqdmmSN6ZvXc.jar
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-J9HhIGSjwwxl2e302oBKMG1afqkx3zAd9TtIb2DiFM4.jar
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-OuDnNS6NxfyRYpxOwceozW5tgZzc94vsXLMyJS255gI.jar
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-DwQgOQN71y2IZ-pv-1zBjPGDmzbdFrm2ZI5fnjn-vg8.jar
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-Cl7AQQmUYHG3Q6BYrKuPvQWMqywQ0BUecLYq43WeBbc.jar
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-X0sXrBt22L_1pIFD97Zj_jeZraq-lHIxfDpGaMy7Qsc.jar
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-oALeMpkKSvjKL8ATOolkaTE6vX393CY15Aw-IGFMqJk.jar
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-9Q-doecvxRFU7AuRGknrWtufKRIoAJHJlKBD-QysPbI.jar
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-maUadc9pXpw_dRpcqOlZqeCZwtjKDyazstrWMG_qKdU.jar
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-pbIEQ-yrwT5Uirin3BIDm7aEs1eVwbUyf7Xio4lVhUI.jar
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-iQ6W45D5GIoi-786sIad-bOtXd1KzpFedx5qAIcDQZA.jar
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-8AlBKk530BZjgLLvLyl17xl_5vak9-Zk4pdyTfvqFsc.jar
    Oct 24, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-IY_YhmYDU3DYWxbs-GF3zTzT_VWJ-CG4Q_z_qTBbefY.jar
    Oct 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.testcontainers/testcontainers/1.15.0-rc2/f818b00ee47532c38b418e677a818ad4dca19db6/testcontainers-1.15.0-rc2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/testcontainers-1.15.0-rc2-avdkCap6A5JPV3hAULK21O7FqzEYvsRL8Ui8VniQ_Ug.jar
    Oct 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.testcontainers/kafka/1.15.0-rc2/47ff228024171dc5e0d938bc3221fae2878c1473/kafka-1.15.0-rc2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/kafka-1.15.0-rc2-bqsYOX1YvlCvptLLM_-rMZ--vSLZslIdDL-9ioqzWBU.jar
    Oct 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-z091UeRlS8LCgPObPW6OgNBiRi_mZwiQywEzgZtC3Fg.jar
    Oct 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-Mx-n_z3IHfbIG_6NaFOn0LsjtU6qYwdrZKn4X90Jt40.jar
    Oct 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.github.docker-java/docker-java-api/3.2.5/8fe5c5e39f940ce58620e77cedc0a2a52d76f9d8/docker-java-api-3.2.5.jar to gs://temp-storage-for-perf-tests/loadtests/staging/docker-java-api-3.2.5-eRxdklIyFsCE2_dXBCj1039ReXyYGkafqCMwJZWXx1I.jar
    Oct 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-gyBle8zxMFDhnoP00coWVw2nDZZv3-FK4HnP1TCWYR8.jar
    Oct 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.github.docker-java/docker-java-transport-zerodep/3.2.5/7fca5e1998c889d486b01695f098ecf760a4ac90/docker-java-transport-zerodep-3.2.5.jar to gs://temp-storage-for-perf-tests/loadtests/staging/docker-java-transport-zerodep-3.2.5-OwcPjMMl-Jjxn-QgBEUudex_Jf9MIbZiSBhFLY5f9vY.jar
    Oct 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna-platform/4.0.0/deb6bf66918989b50209b8c9aaf3b2561af7f011/jna-platform-4.0.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-platform-4.0.0-B21i7Yfna9yzdQ_-gKpJXuIRK9chJmOVTWVH9IJEkuk.jar
    Oct 24, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.github.docker-java/docker-java-transport/3.2.5/27af0ee7ebc2f5672e23ea64769497b5d55ce3ac/docker-java-transport-3.2.5.jar to gs://temp-storage-for-perf-tests/loadtests/staging/docker-java-transport-3.2.5-QA3UXPFvvCRXrSNnNwbFWAm4P7tT_L9aCJHzxaijRO8.jar
    Oct 24, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 183 files cached, 36 files newly uploaded in 1 seconds
    Oct 24, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 24, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 24, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 24, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 24, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 24, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96015 bytes, hash ffb5303b4a855ab35c21f5c9e6b7170cb35d98d466cc07bd3e5e098ee84118cd> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-_7UwO0qFWrNcIfXJ5rcXDLNdmNRmzAe9Pl4JjuhBGM0.pb
    Oct 24, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 24, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-23_23_45_17-2431341590251687529?project=apache-beam-testing
    Oct 24, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-23_23_45_17-2431341590251687529
    Oct 24, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-23_23_45_17-2431341590251687529
    Oct 24, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-24T06:45:17.610Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 24, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T06:45:26.168Z: Worker configuration: n1-standard-1 in us-central1-b.
    Oct 24, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T06:45:26.836Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 24, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T06:45:26.879Z: Expanding GroupByKey operations into optimizable parts.
    Oct 24, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T06:45:26.919Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 24, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T06:45:26.980Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 24, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T06:45:27.012Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 24, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T06:45:27.048Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 24, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T06:45:27.085Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 24, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T06:45:27.519Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 24, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T06:45:27.604Z: Starting 5 workers in us-central1-b...
    Oct 24, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T06:45:46.503Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 24, 2020 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T06:45:50.989Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Oct 24, 2020 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T06:45:51.019Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Oct 24, 2020 6:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T06:45:56.294Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 24, 2020 6:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T06:46:21.862Z: Workers have started successfully.
    Oct 24, 2020 6:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T06:46:21.901Z: Workers have started successfully.
    Oct 24, 2020 6:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T06:46:53.352Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 24, 2020 6:46:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T06:46:53.506Z: Cleaning up.
    Oct 24, 2020 6:46:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T06:46:53.594Z: Stopping worker pool...
    Oct 24, 2020 6:47:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T06:47:45.178Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 24, 2020 6:47:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T06:47:45.217Z: Worker pool stopped.
    Oct 24, 2020 6:47:54 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-23_23_45_17-2431341590251687529 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 6e64ee24-cdc3-44f1-b549-00cbe4a9719c and timestamp: 2020-10-24T06:47:54.566000000Z:
                     Metric:                    Value:
                   read_time                    12.687
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 24, 2020 6:47:55 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 51.162 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 38s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/xukns6vpufoaq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1152

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1152/display/redirect?page=changes>

Changes:

[noreply] Move pymongo out of the optional header

[daniel.o.programmer] [BEAM-11108] Add a Size method to all Go SDK filesystems.

[Luke Cwik] [BEAM-11111] Fix website publish to use stable directory so that .git

[noreply] Fix nil pointer exception in `graphx.Marshal` (#13178)

[noreply] [BEAM-9682] Add assign timestamp lesson to Go SDK katas (#13129)

[Luke Cwik] Add missing . to headings to match other headings.

[Andrew Pilloud] [BEAM-11114] BigQueryServicesImpl.isTableEmpty uses jobs.query API

[Robin Qiu] Update Beam website to release 2.25.0.

[Robin Qiu] Add blog post for Apache Beam 2.25.0.

[noreply] [BEAM-11117] Clarify 'execution of work' messages (#13186)


------------------------------------------
[...truncated 275.85 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 24, 2020 12:45:41 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 24, 2020 12:45:41 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 24, 2020 12:45:41 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 24, 2020 12:45:41 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 24, 2020 12:45:41 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 24, 2020 12:45:41 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 24, 2020 12:45:41 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1214853303]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:150)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
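
For context, the IllegalStateException above is Beam's generic error for a PCollection of Row values that has no schema attached, and the message itself names the usual fix, PCollection.setRowSchema. Below is a minimal sketch of that pattern; the schema, field names, and values are illustrative placeholders and not the ones BigQueryIOPushDownIT uses:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {

      // Placeholder schema mirroring the columns projected by the query above.
      static final Schema SCHEMA =
          Schema.builder()
              .addStringField("author")
              .addStringField("type")
              .addStringField("title")
              .addInt64Field("score")
              .build();

      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        PCollection<Row> rows =
            pipeline
                .apply(Create.of("story", "job"))
                .apply(
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void process(@Element String type, OutputReceiver<Row> out) {
                            out.output(
                                Row.withSchema(SCHEMA)
                                    .addValues("someone", type, "a title", 3L)
                                    .build());
                          }
                        }))
                // Without this (or an explicit setCoder), coder inference for Row fails
                // with exactly the IllegalStateException shown in the stack trace above.
                .setRowSchema(SCHEMA);

        pipeline.run().waitUntilFinish();
      }
    }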

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 24, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 24, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 24, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-uvLQWwkRBsBzpaHkIv2u77S0ZbFHqg4HBa8bq0KJ3Ws.jar
    Oct 24, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-NFo8Wx2mMYCqwBxnbFTTFxvNjao4IV31v6WGLpp5Oig.jar
    Oct 24, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-hS_BLHZATkJ6oWj3Ee_KqQxsoIm33Kswl2iGpwtxdcc.jar
    Oct 24, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-NjHwJzU6rNRhQPqVRdhxlkNbk84QfZFU9RNLti6N-mE.jar
    Oct 24, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-Fs4ErUS-HFCeq1TYb832Q29zgqmdfaWtu-nRMC9GdqI.jar
    Oct 24, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-6DFPlb_Rq6FPZrZdgNMFJfWatCsAOLlgr4813DlolTs.jar
    Oct 24, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-RdJMcwtbHgLJxJTTRu0He-YPPKTYnsA7Why57ZSADcU.jar
    Oct 24, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-oE2E7ZqKLIBELYb1K5tIX5oHZAvFr_rtY10nYB6wZmU.jar
    Oct 24, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test268511189407782549.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-UgS40UY64UVa0JO_2CoGNzR_jSVxvY-K0CaSJy0gkDE.jar
    Oct 24, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-89jL1WXQ9G9P-mzJdU7uJl2YrBGK3oEAmWwdZTmaCeQ.jar
    Oct 24, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-iZRW-gX12u425rZqJgCAJhj3tF_kJGGjuIMPwjxAobg.jar
    Oct 24, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-3hGWyz61Lv73lGccMJzdbUnVFRmM5GoXmNYE9G9DDEI.jar
    Oct 24, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-3xDTb0eqC0B-SmFUbGnuM6ytsAH0gHmYsUlijpJjFmA.jar
    Oct 24, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-Af5_Fa7NvqO0nO7wO5mN4JG6YTrF3xJjvcRm3wCebi0.jar
    Oct 24, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-xuZv2XuB5i6_-AsXizKPKUMRrYho2PODp7Y4IxHOKZA.jar
    Oct 24, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-y5weogMaY8RHMJrs6_FXmeQkr4I2SA8PZyPn5vqPPh8.jar
    Oct 24, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-Hef2SE2mCMOcR3oDIDrQbpGauyada3qDYOH3cMpABe8.jar
    Oct 24, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-YzdJ7GQAiMUtPz2HUatgaISleJNy8piMaHypQlyAlrw.jar
    Oct 24, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tFOJnqredJrWrADAvu62atVihgWJILtjz2ci9x_zXMI.jar
    Oct 24, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-p8RwU0a-OCrUEKj-bOg00kbA6OWrnw5gM2kCWBNNFl4.jar
    Oct 24, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-zr2pljpQhy1ct-LmGa1KUqzANOpAOTZ8Z-y5Ach3eOM.jar
    Oct 24, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-WtiOz_oKlA5K_nocax1kS1LTVIJwAfY1riSBH-My8tY.jar
    Oct 24, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-ZCauC3gPdqqRHNt5s1738-VPF2h7NZMW4mlRLeTexGo.jar
    Oct 24, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-lYP8QiWRvs_NrYzL8gwtsY_9dxWaHb1awu99-3BUq80.jar
    Oct 24, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-wKenMm7hUinmtqwH63BRCw3nDlFeVnjIMWTvJYvRaqM.jar
    Oct 24, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-bLMbeSlcsfssO34frabmJa7y4MeaiFicvHvj091ckpk.jar
    Oct 24, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-r1Bd7PoSDsmKjfXeS9j5LZ5xM4-lo1tZefOyC96bMbo.jar
    Oct 24, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-uvLQWwkRBsBzpaHkIv2u77S0ZbFHqg4HBa8bq0KJ3Ws.jar
    Oct 24, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-5ZcrU06s69zOF82gd0ZkzaqhbvyFO3X17X7bZzkyICA.jar
    Oct 24, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-tVy1I1h8PXifjIbxLkTcgAK9hvoifchzACei1Z8zBmU.jar
    Oct 24, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-_u-8gF1lAojE01ZJtRpkTqbf7JU0x9PYIAtFdsu7q1c.jar
    Oct 24, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96014 bytes, hash 60fd53b51da2e1b741ad875bdf38066e1b41952ca89208018bc1494613cd5bcb> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-YP1TtR2i4bdBrYdb3zgGbhtBlSyokggBi8FJRhPNW8s.pb
    Oct 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 24, 2020 12:45:48 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-23_17_45_47-3432551801304190034?project=apache-beam-testing
    Oct 24, 2020 12:45:48 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-23_17_45_47-3432551801304190034
    Oct 24, 2020 12:45:48 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-23_17_45_47-3432551801304190034
    Oct 24, 2020 12:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-24T00:45:47.605Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 24, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T00:45:54.563Z: Worker configuration: n1-standard-1 in us-central1-b.
    Oct 24, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T00:45:55.177Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 24, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T00:45:55.351Z: Expanding GroupByKey operations into optimizable parts.
    Oct 24, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T00:45:55.437Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 24, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T00:45:55.511Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 24, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T00:45:55.550Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 24, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T00:45:55.588Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 24, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T00:45:55.621Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 24, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T00:45:56.018Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 24, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T00:45:56.103Z: Starting 5 workers in us-central1-b...
    Oct 24, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T00:46:11.298Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 24, 2020 12:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T00:46:20.059Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Oct 24, 2020 12:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T00:46:20.101Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Oct 24, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T00:46:25.348Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 24, 2020 12:46:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T00:46:39.861Z: Workers have started successfully.
    Oct 24, 2020 12:46:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T00:46:39.892Z: Workers have started successfully.
    Oct 24, 2020 12:47:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T00:47:25.417Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 24, 2020 12:47:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T00:47:25.601Z: Cleaning up.
    Oct 24, 2020 12:47:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T00:47:25.681Z: Stopping worker pool...
    Oct 24, 2020 12:48:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T00:48:14.260Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 24, 2020 12:48:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-24T00:48:14.325Z: Worker pool stopped.
    Oct 24, 2020 12:48:19 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-23_17_45_47-3432551801304190034 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b26a534a-ad30-412f-96a5-73f04e3443a6 and timestamp: 2020-10-24T00:48:19.948000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    24.628

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 24, 2020 12:48:20 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
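
As a point of reference for the push-down behaviour exercised above (the usedFields projection plus the pushed filter on a DIRECT_READ source), roughly the same projection and row restriction can be expressed directly against BigQueryIO. This is only an illustrative sketch with a placeholder table name, not the code the test runs:

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        pipeline.apply(
            BigQueryIO.readTableRows()
                .from("my-project:my_dataset.hacker_news") // placeholder table
                .withMethod(TypedRead.Method.DIRECT_READ)
                // Column projection: what the planner reports as usedFields above.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Row restriction: the filter logged as "Pushing down the following filter".
                .withRowRestriction("(`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        pipeline.run().waitUntilFinish();
      }
    }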

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 47.365 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 3s
107 actionable tasks: 70 executed, 37 from cache

Publishing build scan...
https://gradle.com/s/kw3wmeyvrtrzu

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1151

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1151/display/redirect?page=changes>

Changes:

[noreply] [BEAM-9547] More complete indexing capabilities. (#13122)


------------------------------------------
[...truncated 272.31 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 23, 2020 6:45:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 23, 2020 6:45:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 23, 2020 6:45:25 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 23, 2020 6:45:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 23, 2020 6:45:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 23, 2020 6:45:25 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 23, 2020 6:45:25 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@532945473]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:150)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 23, 2020 6:45:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 23, 2020 6:45:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 23, 2020 6:45:25 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 23, 2020 6:45:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 23, 2020 6:45:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 23, 2020 6:45:25 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 23, 2020 6:45:26 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 23, 2020 6:45:26 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 23, 2020 6:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 23, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 23, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-NB1HD1hMfKKACNlkixm1uyMl9fFpu0CjsNcq_r8EVVc.jar
    Oct 23, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tQo1ze0_hR0Qp87HVoPrV9vpqQDBjzId5c_R1_UbBDs.jar
    Oct 23, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-i4jVrpLuMHc3LQKhiauZndvkzvTxtSAyKrOU9yLdZsE.jar
    Oct 23, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-g4x9B49z04gscFHexj5g8-oRBWNHVOXxOrs5tzthZtA.jar
    Oct 23, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-J6djIt-ZIwaduvqzVKe8w4EYHcajWQVdh25HTDzyN2A.jar
    Oct 23, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-0KEPRulT6xf0Tsva3OKfcRh7SniCofVeSSDi3GlfVwY.jar
    Oct 23, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4121615539883461266.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Od2Jsiq416K7Clv8F_NOjyang_Xg_0kZt1NGFOn6a4k.jar
    Oct 23, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-mvhbOy4wyx9gknmjKbtXJ7AOOlS8KNYW5req9GqTeWM.jar
    Oct 23, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-RYOAZ53ry8c99KKX8kuTeNcXXeM0iNGU76c2gxTydCk.jar
    Oct 23, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-H7HYdz0GKeW1IdEZ5zxhamAOSYJRlzPD22rbwGZG82c.jar
    Oct 23, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-IajcL_v5GrEV5d-4kVOfN6iNWpTN-GdOLBTSIJRJyuE.jar
    Oct 23, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-Ws9Bu-JVOE4fqjbWxiSbHZj8WcHEV135fljPT0eDcUc.jar
    Oct 23, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-rBJAw8kZ--3sSsLYSU3YtGQLPi1a16DIH3Ho14nxKk0.jar
    Oct 23, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-cz6Zt4_3OTxS-oXMy5utk3wtEWomGATLECmQVWwlq3w.jar
    Oct 23, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-LqAEOKNPgEz8QxmxLoKv_4-TP94IV13bc0m3URKoh4U.jar
    Oct 23, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-Japk-2IOf75mQplauy8e0iOvkV_dY1w7B5q8zPRMvdE.jar
    Oct 23, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-NB1HD1hMfKKACNlkixm1uyMl9fFpu0CjsNcq_r8EVVc.jar
    Oct 23, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-gKYfEeWTUTQpjtxTwqZDdZW3oPkNVCssKTWgJ30VhJ4.jar
    Oct 23, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-nopG52W-6hsttHOWmIDxPDPwrmPL6W7r5MIPXw84MWM.jar
    Oct 23, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-L9BPhm-C0oNf58wZf-z3IJE97o_rnEIXnY85YyclUak.jar
    Oct 23, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-IweOlH5EF7IkAtHDKf6DXN9l3xV3WXapZXgAr2EW-Xo.jar
    Oct 23, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-3gVh0gKfmaTVsL0JfYZsDzynflusEmcQkLcfrziO42g.jar
    Oct 23, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-6LmG21it3_zFt-j2Uojzrrrv-Gr3mvouELf3KdD4MNU.jar
    Oct 23, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-MDSDeW0_FY3DVswS8isxkVWpaCzf946p4vQlN7GTt8A.jar
    Oct 23, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-NvmW63CV6dThNDNQ7hJUH8NM7brlFxnR_b-p-PE9hF4.jar
    Oct 23, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-T_GXiVtJjSKkBjCs3wdNb720SD9SgW549TkcmvH2j2A.jar
    Oct 23, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-ffVvptyVl70gJy3zD28B7dGTOGZ07d8jLc2Zp5XEoGo.jar
    Oct 23, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-C8v0oUlW8OuxjpAT2npCXCr8_NS65CeX_lRIKqtNiK8.jar
    Oct 23, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-xsMOo0zmN4OdWKggbNDU_WN3vUENVPmyIdxNnx61CpA.jar
    Oct 23, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-WNRfttkQQ_XHBh2KRS2xavUSKzXbfdyzKW87JgGWKKI.jar
    Oct 23, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-FE-UUci6mBRBflO4afe3T-p1RyHbZI9NvQ_xdjGEGWw.jar
    Oct 23, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 23, 2020 6:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 23, 2020 6:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 23, 2020 6:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 23, 2020 6:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 23, 2020 6:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 23, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96013 bytes, hash 16628eeac080859dee9cd56b55324970a6ab999a2b2ec839513741ba9af3208e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-FmKO6sCAhZ3unNVrVTJJcKarmZorLsg5UTdBuprzII4.pb
    Oct 23, 2020 6:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 23, 2020 6:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-23_11_45_32-15232256674371770788?project=apache-beam-testing
    Oct 23, 2020 6:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-23_11_45_32-15232256674371770788
    Oct 23, 2020 6:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-23_11_45_32-15232256674371770788
    Oct 23, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-23T18:45:32.694Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 23, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T18:45:42.290Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 23, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T18:45:43.201Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 23, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T18:45:43.257Z: Expanding GroupByKey operations into optimizable parts.
    Oct 23, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T18:45:43.290Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 23, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T18:45:43.377Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 23, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T18:45:43.418Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 23, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T18:45:43.454Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 23, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T18:45:43.488Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 23, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T18:45:44.202Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 23, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T18:45:44.282Z: Starting 5 workers in us-central1-c...
    Oct 23, 2020 6:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T18:46:09.795Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 23, 2020 6:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T18:46:10.264Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 23, 2020 6:46:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T18:46:38.299Z: Workers have started successfully.
    Oct 23, 2020 6:46:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T18:46:38.333Z: Workers have started successfully.
    Oct 23, 2020 6:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T18:47:14.268Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 23, 2020 6:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T18:47:14.463Z: Cleaning up.
    Oct 23, 2020 6:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T18:47:14.558Z: Stopping worker pool...
    Oct 23, 2020 6:48:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T18:48:08.220Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 23, 2020 6:48:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T18:48:08.333Z: Worker pool stopped.
    Oct 23, 2020 6:48:17 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-23_11_45_32-15232256674371770788 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0b3a79c8-772a-47ee-a21c-cfa1c6ee5e7f and timestamp: 2020-10-23T18:48:18.100000000Z:
                     Metric:                    Value:
                   read_time                    15.732
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 23, 2020 6:48:18 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.063 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.123 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 3 mins 5.039 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 2s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/2mpr5imy3gglc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1150

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1150/display/redirect>

Changes:


------------------------------------------
[...truncated 271.51 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 23, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 23, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@483472390]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:150)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
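
A minimal sketch of the kind of fix this error message points at, assuming the coder cannot be inferred because the Row output never had a schema attached. The class name, helper name, and field types below are illustrative assumptions, not taken from BigQueryIOPushDownIT:

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaFixSketch {
      // Hypothetical helper: attaches a schema so Beam can infer a coder for a PCollection<Row>.
      // Field names mirror the projected columns (author, type, title, score); the real table
      // schema used by the integration test may differ.
      static PCollection<Row> attachSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();
        // PCollection.setRowSchema installs a SchemaCoder, which is what the message asks for.
        rows.setRowSchema(schema);
        // Equivalent alternative suggested by the message: rows.setCoder(RowCoder.of(schema));
        return rows;
      }
    }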

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 23, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-9FMTFaQf-_8oAbGgHSAYtxQNts39vPpCi7PwjHiMQS8.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-UGM9E2EDdyW6i14Vl-WYZUFmg8gIwL8t253-KEYpU3Q.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-Z12c3MtUXPOlIIsElP1O-wN5MbG0EGIWjAuXD_CPt0c.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-B4JgB-vWc174fZiMYnu5fk8pYe1rRA1qDSzNaEWhJlY.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-vm1zYFkv3RzRR8kCjb4W0YAAGW-DwasMHF7egAWZ0vc.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-6kOc7Zfep6qSB3akah1DeLssrQckltLMAVXRZJNrhSo.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-eoWcXOspv0hGeRuTGVrDF0TeKPzcCJOmAn27SclZNs0.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-S-w6Y40ZTdaMrQPRcto8DYHCZBsu9tN-XR3HgZic8rU.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-uizG675MH_3ahmdp8m7raPqGTp6n0V9rDOl7Easrloo.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-rOfoHhKL5zYeLkF-iBOIpP7rpWpMBwaqOjnGwYGTPpo.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-ALVvVCqFfLEx7NNRDf56MHyb3jpvpxgTaeKTdpWCdGI.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-NMJnaMIxM8Dd8py_nxDTQyUjsu--PxZq25FMldZXqlM.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4147215693145151159.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-f1yty9OK8BzeFuiYxsAlzDOGSukX3rDPxy00CvgQd5w.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-udn8jBJxjn5w5EBEk3-qhdPQHbilJ1kAlNZnJRIYKNY.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-aqsH2nMIV7_bIyQUhX2un1IVhHN8tniJrkb70K4YoNg.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-lAKtXAuE-CepTQA2vqUaUPEdsSopDgvvA3YSFSHsxQQ.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-9FMTFaQf-_8oAbGgHSAYtxQNts39vPpCi7PwjHiMQS8.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-KhvvGtwgSOyzFsvShWvvtOgAVptp1JrwR8XMbR8vIgI.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-9jbZjC_7Ro5eJxKeHpmDISQesss2E5jHoEXfnR1wmIc.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-6HuzApDbDKQDGRpI2VVLRdw5gdzXUAD-BW-1Oatne78.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-VT05VUu8foUPdnpe9Mk_LovkGwnwmF7cVvFmkA-Cr-w.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-5VH1cSYuCU00afUcoRgSIbNljlbUc543kNBnAHapzP0.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-uJSmN4HgOptch3JZUam5O43jehsVKQlWWnavHQA0WD4.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-WQTrFc7S0CbXm8EveeLgrJ2gcxslVov6wfC_ZWErV_U.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-a-kX3LkuIJRq_fmklkk_vx9DGDQIcjUgAGOoDShHT6w.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-QR3o-dHzhoGvy8clHNd0XYKfaImxyaQvTJsPYuxyi7U.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-m2prkEeKYhxroXgfdiSDzXVyRUHGUsbzLpM13FDljtk.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-EhIybOFtN-o49f1XpWYbPc-uU4BSAMZvNJSsCRWIdlQ.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-xIklrjKAZJvEi4AtDstiCKE6i6LcX0Z0THrrDquJ5gU.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-zO8IhP86JPCt9uPwAyFTZ-4DPrIC_iK8P7Xy_5mph9Q.jar
    Oct 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-CmAsoSQH018mbJ-SfMm6P_IwiC8LF8Xc_cZT-GwPM6I.jar
    Oct 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96015 bytes, hash dccbf9a19eb9a32b4ed8b373c43c6aeafac8c455887df3bf22b56f3077775617> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-3Mv5oZ65oytO2LNzxDxq6vrIxFWIffO_IrVvMHd3Vhc.pb
    Oct 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 23, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-23_05_45_19-4254265136064416330?project=apache-beam-testing
    Oct 23, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-23_05_45_19-4254265136064416330
    Oct 23, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-23_05_45_19-4254265136064416330
    Oct 23, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-23T12:45:19.786Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 23, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T12:45:26.140Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 23, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T12:45:26.829Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 23, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T12:45:26.872Z: Expanding GroupByKey operations into optimizable parts.
    Oct 23, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T12:45:26.894Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 23, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T12:45:26.960Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 23, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T12:45:27.015Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 23, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T12:45:27.043Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 23, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T12:45:27.064Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 23, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T12:45:27.461Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 23, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T12:45:27.540Z: Starting 5 workers in us-central1-f...
    Oct 23, 2020 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T12:45:58.537Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 23, 2020 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T12:46:00.457Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 23, 2020 12:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T12:46:16.977Z: Workers have started successfully.
    Oct 23, 2020 12:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T12:46:17.006Z: Workers have started successfully.
    Oct 23, 2020 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T12:46:55.158Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 23, 2020 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T12:46:55.403Z: Cleaning up.
    Oct 23, 2020 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T12:46:55.474Z: Stopping worker pool...
    Oct 23, 2020 12:47:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T12:47:46.988Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 23, 2020 12:47:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T12:47:47.035Z: Worker pool stopped.
    Oct 23, 2020 12:47:52 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-23_05_45_19-4254265136064416330 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 2cf6a50b-502b-45e5-a356-9938a19737e2 and timestamp: 2020-10-23T12:47:52.736000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    18.535

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 23, 2020 12:47:53 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 47.016 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 36s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/lrek25g35ytiu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1149

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1149/display/redirect?page=changes>

Changes:

[ajamato] [BEAM-11092] Add protos for new process wide HarnessMonitoringInfos, and

[noreply] Update metric.py

[Boyuan Zhang] Populate sideinput for SDF.

[noreply] [BEAM-11078] Add splittable DoFn documentation to programming guide


------------------------------------------
[...truncated 278.42 KB...]
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 23, 2020 6:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 23, 2020 6:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 23, 2020 6:45:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 23, 2020 6:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 23, 2020 6:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 23, 2020 6:45:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 23, 2020 6:45:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1214853303]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:150)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 23, 2020 6:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 23, 2020 6:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 23, 2020 6:45:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 23, 2020 6:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 23, 2020 6:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 23, 2020 6:45:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 23, 2020 6:45:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 23, 2020 6:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 23, 2020 6:45:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 23, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 23, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-Lyz6XulGiXh0xuKZvDAawlq429PNj6-VSVwGx741x44.jar
    Oct 23, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-eI_uzumQvuRfhXkQCf4aNqZh_U6VqHdPPh2Do0uvQFI.jar
    Oct 23, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-S51hGpDnJeIfLqwOrlaZIZc6PmQIAgkP9yKphoHblIs.jar
    Oct 23, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-TZ9AZxyvddXGNrdACI8VyjnTg-Rx6BzTrar90bNvl9s.jar
    Oct 23, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-8kXX9zebh0KgHn23LjrFVoto8w_4XRGvYS08jaOt-5k.jar
    Oct 23, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-8zNM0mrmfOtS133jzouNusH0Qjw7UKhkPNKloikOiP8.jar
    Oct 23, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3963568572600089110.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-pd3w-yVxO4fnjQh3A4egQ3FXVW-3JFS0sN9p1J1vshU.jar
    Oct 23, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-jv3KK-W5W47pGYZcvEHRfe2jVWBnHDEDohJbdP68kho.jar
    Oct 23, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-ajflsRBhjqH9kuqplgyouG_Oo2omCPgVs5GepgxBTPQ.jar
    Oct 23, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-QlYsKM_a9o6gOvF8Wz5GHHILag1aycLg5iQvMbX8VJw.jar
    Oct 23, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-HdzJ-pwy0vtxlFkO3QuXI-vZtFVkILSrzG5XrocGs5Y.jar
    Oct 23, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-H4KUnyQoEbmt56R-ifMWlp3dfelp7BW1ulThNR2D_m8.jar
    Oct 23, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-IzPVX2OVXvJbzLn3HjY4n3J2LlLrRjPHRFJlmgzDqHk.jar
    Oct 23, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-JgxTdxDpT-GV6CjyQpVaZ9_CH1mZfXqiYY1lBSE1jxI.jar
    Oct 23, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-O8uRh83WHtkP4begpWEIFitzAgtEcOSFrzW0zVpBJ4k.jar
    Oct 23, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-Us0Pzlywm4Jnl8ASv4ciudGDJp8eQJCH-FsrMUbQ7AQ.jar
    Oct 23, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-Lyz6XulGiXh0xuKZvDAawlq429PNj6-VSVwGx741x44.jar
    Oct 23, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-iMoupJtyyYMEz47JRdjFbe56lOfPbxZNk_d59gCaEqc.jar
    Oct 23, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-waalpeuS8-9Ulk8NZy66TijaKR_K0zOCDbsu-b6G2xs.jar
    Oct 23, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-kBgbu45UbYNQQuDwNz_yyCQulc4ZJiU9g-_lB9oMv1U.jar
    Oct 23, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-ewAfK8ufxFViDjQJoY9LTIwHCjaDT-L8RifcoSTInDU.jar
    Oct 23, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-CrUqwebS2Ce4hfHMyrqFK4N6OLyU8uqU6W0f25rSZU8.jar
    Oct 23, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-lV0yjJrmG08JcSxdYbCTPiFqvpop5wtkXuV_x-lRGXA.jar
    Oct 23, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-r-mHU-iuQOWYP8wC3vGNM5jML3eC1d2xL30t_gRiT-g.jar
    Oct 23, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-1AeghF0Qjkg36jb6KPO4PEidnBqmpoVCfWlhScBNXQY.jar
    Oct 23, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-y5QdA8eL8vbh8Lt2NOvEkcP0s99HeNecsfWmqd_8sCs.jar
    Oct 23, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-oZypvoUgHwK-VmMzv_QxsJGRrqkfxxjdZYz7mr3SyY8.jar
    Oct 23, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-0a5b5M1VGbxzBTDI-6Em_PsAjqJ2AplIosyWmVJ_ByQ.jar
    Oct 23, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-SyCUpe1tg9EFsW4dsgSowLHLkizirhiQcWJjz2NU_zc.jar
    Oct 23, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-zi4IxLx_vZeSeXw0KDlFABlxzQ2LdhPoFHZP6lSgYi0.jar
    Oct 23, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-TvvDVV6KhfvWbTmZqpB29uXxj5U2t0XS-eOpauYK2sk.jar
    Oct 23, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 23, 2020 6:45:45 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 23, 2020 6:45:45 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 23, 2020 6:45:45 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 23, 2020 6:45:45 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 23, 2020 6:45:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 23, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96015 bytes, hash 36803b9739256a32a2a916d52ea8ccf77465d6b8219a78cf1fc792524656772f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-NoA7lzklajKiqRbVLqjM93Rl1rghmnjPH8eSUkZWdy8.pb
    Oct 23, 2020 6:45:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 23, 2020 6:45:46 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-22_23_45_45-12745538004906896183?project=apache-beam-testing
    Oct 23, 2020 6:45:46 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-22_23_45_45-12745538004906896183
    Oct 23, 2020 6:45:46 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-22_23_45_45-12745538004906896183
    Oct 23, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-23T06:45:45.705Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 23, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T06:45:53.232Z: Worker configuration: n1-standard-1 in us-central1-b.
    Oct 23, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T06:45:53.880Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 23, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T06:45:53.918Z: Expanding GroupByKey operations into optimizable parts.
    Oct 23, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T06:45:53.951Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 23, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T06:45:54.026Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 23, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T06:45:54.064Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 23, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T06:45:54.101Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 23, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T06:45:54.144Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 23, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T06:45:54.582Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 23, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T06:45:54.707Z: Starting 5 workers in us-central1-b...
    Oct 23, 2020 6:46:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T06:46:09.793Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 23, 2020 6:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T06:46:17.890Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 23, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T06:46:43.789Z: Workers have started successfully.
    Oct 23, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T06:46:43.825Z: Workers have started successfully.
    Oct 23, 2020 6:47:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T06:47:20.628Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 23, 2020 6:47:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T06:47:20.767Z: Cleaning up.
    Oct 23, 2020 6:47:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T06:47:20.976Z: Stopping worker pool...
    Oct 23, 2020 6:48:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T06:48:10.126Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 23, 2020 6:48:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T06:48:10.163Z: Worker pool stopped.
    Oct 23, 2020 6:48:17 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-22_23_45_45-12745538004906896183 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ddd2e5ce-655e-4c33-bd1b-ca425513a866 and timestamp: 2020-10-23T06:48:17.791000000Z:
                     Metric:                    Value:
                   read_time                    16.571
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 23, 2020 6:48:18 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 47.05 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 1s
107 actionable tasks: 73 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/b2y4bl3zwubus

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1148

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1148/display/redirect?page=changes>

Changes:

[Kenneth Knowles] Make checkerframework qualifiers compile+runtime scope

[noreply] [BEAM-9547] Add several simple methods to dataframes. (#13157)


------------------------------------------
[...truncated 280.84 KB...]
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 23, 2020 12:46:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 23, 2020 12:46:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 23, 2020 12:46:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 23, 2020 12:46:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 23, 2020 12:46:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 23, 2020 12:46:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 23, 2020 12:46:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1214853303]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:150)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
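
As the error message above notes, the Row output needs a schema before a coder can be inferred. A minimal sketch of that fix, hedged: the field names and types below are assumptions taken from the projected columns in the SQL, not the integration test's actual schema, and the helper class/method names are illustrative only.

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaFix {
      // Attach an explicit schema so the SDK can infer a coder for the Row PCollection.
      static PCollection<Row> withRowSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addStringField("author")   // assumed from the SELECT list above
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();
        // setRowSchema is the fix the message suggests; it is roughly equivalent
        // to rows.setCoder(RowCoder.of(schema)).
        return rows.setRowSchema(schema);
      }
    }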

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 23, 2020 12:46:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 23, 2020 12:46:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 23, 2020 12:46:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 23, 2020 12:46:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 23, 2020 12:46:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 23, 2020 12:46:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 23, 2020 12:46:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 23, 2020 12:46:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
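
For reference, a hedged sketch of what the pushed-down read above corresponds to at the BigQueryIO level: the selected fields and row restriction are taken from the log, while the table reference and class name are placeholders, not the test's actual code.

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;

    class PushDownReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        // Project only the used fields and push the filter down to the Storage Read API.
        p.apply(
            BigQueryIO.readTableRows()
                .from("<project>:<dataset>.hacker_news")   // placeholder table reference
                .withMethod(Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction(
                    "(`type` = 'story' OR `type` = 'job') AND `score` > 2"));
        // Illustration only; the pipeline is not run here.
      }
    }
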
    Oct 23, 2020 12:46:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-9lCK8vNheZjgVD_1FWiYEa4cS2hiAd2uClj09U41l_g.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-Obt3ATh5mT_fLa7etfiDo6Ko5k8WY6nTWxwKwHNTNsI.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-IvsnUSsZJe_yTDKyW9ARHwIHWxpukMXZqOhdi_WG1Z0.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-ngPX01bXlYtAAobxPdzsoOQ8ftC3oC3M5233UJrXlP8.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-M-iQqt6ivl11SLimQEmMfNnBruTar0FwHK-3BgNMDck.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-KeYsIYt4Pv2MUbptj9bg4WgJg303_fY-Y6yHKFXPr8M.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-76HT6A0qzeiwRTszpocqW3yNygrHo8eo7I3tpV-ScnY.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-bQfAem2bPRWQemRkPN1tRIl4vKryQ_S1pgfSYHRjwpU.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-hlqyzvRHbWKg9FfwgsDYAnTSjeRjRtO2uw0XtiW6_vU.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-IzZE8ewlXYhmjVRoKJSf-5752K3cuLNgxIi5_aN3fU8.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-AHzD5F1G51LpsJ3EzEh1z_tRzOU3-0t8QoFUH9LeiBM.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-g42zCmGlG29XzW8BtJlWDCD9dyNQw-9HSoq8j7BpHeQ.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test9013299756921381369.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-DBzyIoStFHie2stkoFeKEo9AzzN9e1cJ_D1PQoqmMXU.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-iNZNUW10yW5hV2G9QFlaRVUEEDUa_N4qgdSU8NXVzfY.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-3GM9NQfpjWY5waVKJKjTKzwHjK62o9X2MzXc-yR-Viw.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-dYdrubnZnbGJpWGyxdO00uNv0Xcsg3uhg5LAgaA4fus.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-9lCK8vNheZjgVD_1FWiYEa4cS2hiAd2uClj09U41l_g.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-fsOWC37cKTdLfG9BLOd2XiV5QVkZ0Ua12VkndSBn8Mo.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-2_4Dru1MFKWfAHctTPvOXtnB9KMK0ZNWFrwAua-nmWY.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-I2UtC5AcazeGaVPLhb5_KaUgEXfTP_cP9s7moQB_4is.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-lubkHXE3HNTl2itT5LhcbCs2BA84cfmU1Tv31oBksOg.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-vN-lKo634wmKkGB-y9TRdIH1-a-1YDeA6oUThx7_pKo.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-UxwHacw5NxACBkrQHFcwt-LRLYVoAbxZOvM2mj0DP0o.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded--jD-odRyQTB9QY28L6GEfQ9M_wKz6GKK6TQcKzBZI_k.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-0GP8mDbfcANkuj8w067ObifQnDjBVPmrQfU6pZlrMpc.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-ynT6TMLKnylR8z1h3YCcFH4SF1RySbIBOqY7LvKLA_Q.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-jwpEEjwXr4YEEL2tNHClW5dq7FvRtxC29Yaa4RK5Nl8.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-GGbGV8PXpPVgv-8hkS4EaGMHxgHXdtMz4Fzzoa3c4Z8.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-U0i6I0cUP2oDao5hM0KzKgp7CZrJyoZnSZzTjtgsP6w.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-SL19-PDTMJCS2KP4mCDWiYXByi7DWLKaSsUHAL_E83c.jar
    Oct 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-PRUnLYzuG_b_8mCT0S0-viDt0BNZuLHzvHfq-BFZDtw.jar
    Oct 23, 2020 12:46:07 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 23, 2020 12:46:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 23, 2020 12:46:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 23, 2020 12:46:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 23, 2020 12:46:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 23, 2020 12:46:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 23, 2020 12:46:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96015 bytes, hash 090c3be9a0e20244b223a6e4f81368f11348810e219b7eca75e79e3a9efe58ef> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-CQw76aDiAkSyI6bk-BNo8RNIgQ4hm37KdeeeOp7-WO8.pb
    Oct 23, 2020 12:46:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 23, 2020 12:46:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-22_17_46_07-4659547575342290976?project=apache-beam-testing
    Oct 23, 2020 12:46:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-22_17_46_07-4659547575342290976
    Oct 23, 2020 12:46:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-22_17_46_07-4659547575342290976
    Oct 23, 2020 12:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-23T00:46:07.938Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 23, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T00:46:15.375Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 23, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T00:46:16.231Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 23, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T00:46:16.282Z: Expanding GroupByKey operations into optimizable parts.
    Oct 23, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T00:46:16.317Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 23, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T00:46:16.401Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 23, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T00:46:16.434Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 23, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T00:46:16.465Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 23, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T00:46:16.495Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 23, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T00:46:17.003Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 23, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T00:46:17.099Z: Starting 5 workers in us-central1-c...
    Oct 23, 2020 12:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T00:46:34.942Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 23, 2020 12:46:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T00:46:40.048Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 23, 2020 12:47:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T00:47:04.829Z: Workers have started successfully.
    Oct 23, 2020 12:47:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T00:47:04.864Z: Workers have started successfully.
    Oct 23, 2020 12:47:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T00:47:42.534Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 23, 2020 12:47:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T00:47:42.696Z: Cleaning up.
    Oct 23, 2020 12:47:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T00:47:42.830Z: Stopping worker pool...
    Oct 23, 2020 12:48:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T00:48:33.636Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 23, 2020 12:48:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-23T00:48:33.688Z: Worker pool stopped.
    Oct 23, 2020 12:48:43 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-22_17_46_07-4659547575342290976 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c87a6d08-85f2-4f2f-af5d-264eed6d644e and timestamp: 2020-10-23T00:48:43.319000000Z:
                     Metric:                    Value:
                   read_time                    18.093
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 23, 2020 12:48:44 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 52.604 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 26s
107 actionable tasks: 75 executed, 32 from cache

Publishing build scan...
https://gradle.com/s/ckebqdmqivno4

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1147

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1147/display/redirect?page=changes>

Changes:

[noreply] [BEAM-7372] Remove py2 checks in setup.py packages (#13138)

[noreply] [BEAM-11101] Fix BigQuery snippets (#13167)


------------------------------------------
[...truncated 273.05 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 22, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 22, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 22, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 22, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 22, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 22, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 22, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@532945473]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:150)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 22, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 22, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 22, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 22, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 22, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 22, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 22, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 22, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 22, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-JGEPVejcs4u8alT0J2dZU-swjSRT7jnWzxTm26LTAOg.jar
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-17f4DQ7g4JtIVmgNdNooj--3G7lu--_bJSjeaujG6T4.jar
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-YC0pQtcdYDS1EqiZEW3LhbehpwcI65vERQbmTnneHS0.jar
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-xeQ_PVZK5agXj4JLCq5i7z51r2zOartLkp0pLnpF540.jar
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-JXtkcbVi4mN2zfupIk18qG1sB9Pq85Or8Ze8kpedAg0.jar
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-avHhMURHyHW2xkbaxbmCO1nkaJXLUkppyDt1UhtC_Bk.jar
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-5Ib6IvYZJ9q3lq4uh_2e-e9c2O4okbdiYfe5PGgHc9Y.jar
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-RNKVpiVmvTsb8REEG_OA-bgj4WFMbXhTpllDsMHxOvs.jar
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-tTe3PEyJ0e2aXF-KELfNOO9MmVRdcSRyVAGSVTRD_mY.jar
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-MjGfdt0bzW_cXh9InJ48LeKLbF1pLCA_QMpTxR2Yoyc.jar
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-q9_z_eQ5BUnd_S21Nm2j-6BCtOA-695Px2HVrta9oOk.jar
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-Uzre9ikKfdX6q8ODLPmm2sBYEzRpyIzKdQF8-tLiuyw.jar
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-a4NnoCITwU5p1VWjZtFd8bMeC3qnw6QT-uC7Z9xIzCc.jar
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-z3D1zNBQKQihyukaiZoQGp5CkNMGs-_ug4-p44_EOsM.jar
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-Rbtp53lDwQpGXChGdtBJ8OKicOiOE8GCExGG-EMXO9Y.jar
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-JGEPVejcs4u8alT0J2dZU-swjSRT7jnWzxTm26LTAOg.jar
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-JDyeThJBantVjXCuVAkw26nZEjcKKt4r0TAuaQ94nA4.jar
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-0kh-jBRN6s5-vdjnIQSGRRzEcpLBgetPGD7-H3Fq_Yc.jar
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-CuPYsXIIjOca77eGMVNM5l_NUtZhq9WcAmIC-jMenMQ.jar
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-YeMDuWxRoWhxC1eefnGfeBdSt6b6ck23ImJf8LiI2Pw.jar
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8575518156443829499.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test---lOVWAefjMWDVEDLBdIjHBp6q38zeVJidW0AV7e7zI.jar
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-uzAd452mElLvNncb58pIe4hvHOBBgV4S-rmuGxwQxQ4.jar
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-nmX7ooFTpqkf456IRbxzeEvcnKI7CRbk-61VkSALTws.jar
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-SKoLsHt0of6rRtwnSecxV39P1ajvlav8HYHSoDo26co.jar
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-ADHmNyX1UQ82WmDyGZKuhBelIldrpPuYbZ-l4MyrhSA.jar
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-Fdx5729I6IWcE8MppSeLikRZplPirOZI37Y1JK2eMOw.jar
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-Y0UMtsLj3SvBLiPCpMmzooVP4WhQKIoBFYPBw9C-TcU.jar
    Oct 22, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT--PW7OQ1rdHUkZ87CA3L6YpHHegqqAgawyyjITI3L7T0.jar
    Oct 22, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-k8AGaIsZoVDHQOmIeih0EPO0514n1o4awSLR5xZaSaE.jar
    Oct 22, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT--mX6O8j1yyoOBURwRpN9hu6JcIusUEOTgTuSSrugdrY.jar
    Oct 22, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-KlLipVV-hy7wJJNQRNhCW8vXVAWkTMLIHaNv8YYCqXs.jar
    Oct 22, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 0 seconds
    Oct 22, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 22, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 22, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 22, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 22, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 22, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96013 bytes, hash 285f61ff78b9e738c7fb7448de2203257a5738220973871c0c473ad77dc23c78> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-KF9h_3i55zjH-3RI3iIDJXpXOCIJc4ccDEc6133CPHg.pb
    Oct 22, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 22, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-22_11_45_21-2859248471834963109?project=apache-beam-testing
    Oct 22, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-22_11_45_21-2859248471834963109
    Oct 22, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-22_11_45_21-2859248471834963109
    Oct 22, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-22T18:45:21.076Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 22, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T18:45:29.492Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 22, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T18:45:30.294Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 22, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T18:45:30.337Z: Expanding GroupByKey operations into optimizable parts.
    Oct 22, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T18:45:30.372Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 22, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T18:45:30.452Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 22, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T18:45:30.480Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 22, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T18:45:30.506Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 22, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T18:45:30.552Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 22, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T18:45:31.058Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 22, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T18:45:31.139Z: Starting 5 workers in us-central1-c...
    Oct 22, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T18:45:54.124Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Oct 22, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T18:45:54.153Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Oct 22, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T18:45:59.448Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 22, 2020 6:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T18:46:03.411Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 22, 2020 6:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T18:46:19.673Z: Workers have started successfully.
    Oct 22, 2020 6:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T18:46:19.708Z: Workers have started successfully.
    Oct 22, 2020 6:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T18:46:58.785Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 22, 2020 6:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T18:46:58.989Z: Cleaning up.
    Oct 22, 2020 6:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T18:46:59.077Z: Stopping worker pool...
    Oct 22, 2020 6:47:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T18:47:50.189Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 22, 2020 6:47:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T18:47:50.241Z: Worker pool stopped.
    Oct 22, 2020 6:47:55 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-22_11_45_21-2859248471834963109 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 260e5906-6478-478a-80fd-7f2ff7b8cc81 and timestamp: 2020-10-22T18:47:55.908000000Z:
                     Metric:                    Value:
                   read_time                    19.206
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 22, 2020 6:47:56 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
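    This warning means the run did not supply InfluxDB settings, so the read_time/fields_read values above were only printed, not persisted. A hedged sketch of how a local run might pass them; the option names (--influxMeasurement, --influxDatabase, --influxHost) and the example values are assumptions to be checked against the test's pipeline options, and the usual project/tempLocation options are omitted for brevity:

        ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest \
            --tests org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT \
            -DintegrationTestPipelineOptions='["--influxMeasurement=sql_bqio_read_java_batch","--influxDatabase=beam_test_metrics","--influxHost=http://localhost:8086"]'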

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.052 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.036 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 52.048 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 41s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/oplawzc534rn2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1146

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1146/display/redirect>

Changes:


------------------------------------------
[...truncated 271.44 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 22, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 22, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 22, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 22, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 22, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 22, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 22, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1214853303]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:150)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
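    The failure above is about the Row coder on the RowMonitor output rather than about BigQuery itself: a PCollection of Beam Rows needs either a schema or an explicit coder before the pipeline finishes specifying. A self-contained sketch of the two remedies the message names, using the plain Beam Java API with a stand-in schema and in-memory input (not the IT's actual code):

        import org.apache.beam.sdk.Pipeline;
        import org.apache.beam.sdk.coders.RowCoder;
        import org.apache.beam.sdk.options.PipelineOptionsFactory;
        import org.apache.beam.sdk.schemas.Schema;
        import org.apache.beam.sdk.transforms.Create;
        import org.apache.beam.sdk.values.PCollection;
        import org.apache.beam.sdk.values.Row;

        public class RowCoderFixSketch {
          public static void main(String[] args) {
            // Stand-in schema matching the columns the test query selects.
            Schema schema = Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();
            Row row = Row.withSchema(schema).addValues("someone", "story", "a title", 3L).build();

            Pipeline p = Pipeline.create(PipelineOptionsFactory.create());

            // Remedy 1: attach the schema so Beam can infer the Row coder
            // (the PCollection.setRowSchema route mentioned in the error message).
            PCollection<Row> withSchema =
                p.apply("CreateRows", Create.of(row).withRowSchema(schema));

            // Remedy 2: specify the coder explicitly on the source transform; the trailing
            // setCoder() call is redundant here but shows the .setCoder() fix the message suggests.
            PCollection<Row> withCoder =
                p.apply("CreateRowsAgain", Create.of(row).withCoder(RowCoder.of(schema)));
            withCoder.setCoder(RowCoder.of(schema));

            p.run().waitUntilFinish();
          }
        }

    In the failing test, the same call would have to be applied where the BeamIOSourceRel_95/ParDo(RowMonitor) output named in the exception is produced.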

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 22, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 22, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 22, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 22, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 22, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 22, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 22, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 22, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 22, 2020 12:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-D0mRoo_cdNQsA8X3YuPkhWZM3PiANbrrt7SChu7ZDh8.jar
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-h_2kPPpJqYyjoEZ9afLx4mRNpIOwHkNtPZTiExJYMbs.jar
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-zSEhEZYmkdYNPxqQW5qkRcgU3tTuQ14fJN-duvNkLKc.jar
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-Nc4SFc-LVTpIXk2zU9zatWvmABeIR5eJauoEsbvUkRM.jar
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-jwYrgHJKk8TnHQ-UX_HMXF-YWE7hbDwNY7HnXMSARR8.jar
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-dmHIi0c5AvNBGrrhgUX0dqPigmgFrq53t3Zk9AjLnQQ.jar
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-xf57b2eaVixv_M9vS0oCBv2ZYOnJtg6hTHmcDsOKt9U.jar
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-y7_XLmeFmM5xXx00IQM6u8-2GeXizBFcO8qyDi6HoLg.jar
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-TwPHPtkXWp0-5qe7n5dTjT-WTVWRMpADg_gKNWPNFuk.jar
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6349508519468142431.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Q5Q4UT_Mt2deRjVw7BMUVvOUA_o0F0E5ExFoiismDtI.jar
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-yIRhTf71-JxDy1a0M8_la7A9aG45GvOfjaTmYe_bRzI.jar
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-bZRUga_BV2BWOCke1-Vqy6Hy0Xj7yUjN2hw3cPp0pSI.jar
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-7nnrYduPFNdZCVOxjg6h8vMAAvBN3imMEYjYIP4X8G8.jar
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-AbWr4zWvBu7tK_v-SsUMNsCVTIyZzQzIyWLBUuGvUQQ.jar
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-Vvhf2OBhqALDLLJe1vJPAI5E1sIvEumCG_h-vynHKZQ.jar
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-9GgNxLpmMYq6dXankSeipGb74-7VK5oqpVQEs_W6n2c.jar
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-IDacYrZOBeL7v5gRYjJEcBNo86vJgRv_9eY22L4pa1k.jar
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-uneVKgtkEhwObDqLPhfZ3ilPT0UEaTIM3habiO0WDNE.jar
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-7tQKujcjhPq6N9nyZ04DU0NCTzBEhPANwwrhlHUDzm8.jar
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-NiFYFd4dv8D_7oVVJV3Q4QiMP_vMOQKUuEwvdmTdKLA.jar
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-sGdB8PDJ3zFi8joCI3E1xIBpb2nQALhAVXutbgq0wFM.jar
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-gRanBcrOycqJu_d43FlWXamhhshKoTwN2EmaaNy_lG8.jar
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-RfSjhfSCixb1ob0a7MhuqijAxFbPfplhVnk8devpjzk.jar
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-muKub3TWGsoo9VkRf4RXhhKhoXtqksLnh7caqEDvQp8.jar
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-QAxUUnUsP2BApzgg2kGgzdAwWCZ58mvbHDHqWM251DA.jar
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-Gm4bJB2UEhv8u4No26MgIMl1GhKoXCHImLOGc63iIz0.jar
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-5AFfdTlENf3ZKkU1KMExWga_gbpq8LexFpDsTs8sRcA.jar
    Oct 22, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-D0mRoo_cdNQsA8X3YuPkhWZM3PiANbrrt7SChu7ZDh8.jar
    Oct 22, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-CRgQJgpSH0PadyzfxnOPtZUper1p4iMZNI5XrYiT9bU.jar
    Oct 22, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-8UOVpEe6M2a7chLSYUFT7a_rW3m8apX-mDcfpNJJgwg.jar
    Oct 22, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-CFLe0Rv_rbOmi3xa1XPDFg4mUfhc2kqYy0iKVI_LMgk.jar
    Oct 22, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 22, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 22, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96015 bytes, hash a4b45084d0cdce58f7e8ed49d69f3107d42dd9839cfac118377ff983a934c026> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-pLRQhNDNzlj36O1J1p8xB9Qt2YOc-sEYN3_5g6k0wCY.pb
    Oct 22, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 22, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-22_05_45_16-8826816439049501552?project=apache-beam-testing
    Oct 22, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-22_05_45_16-8826816439049501552
    Oct 22, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-22_05_45_16-8826816439049501552
    Oct 22, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-22T12:45:16.395Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 22, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T12:45:26.854Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 22, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T12:45:27.738Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 22, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T12:45:27.889Z: Expanding GroupByKey operations into optimizable parts.
    Oct 22, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T12:45:27.968Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 22, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T12:45:28.062Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 22, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T12:45:28.099Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 22, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T12:45:28.131Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 22, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T12:45:28.166Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 22, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T12:45:28.595Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 22, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T12:45:28.726Z: Starting 5 workers in us-central1-c...
    Oct 22, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T12:45:35.050Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 22, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T12:45:51.827Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Oct 22, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T12:45:51.864Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Oct 22, 2020 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T12:46:12.369Z: Workers have started successfully.
    Oct 22, 2020 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T12:46:12.427Z: Workers have started successfully.
    Oct 22, 2020 12:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T12:46:23.521Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 22, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T12:46:52.807Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 22, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T12:46:52.985Z: Cleaning up.
    Oct 22, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T12:46:53.092Z: Stopping worker pool...
    Oct 22, 2020 12:48:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T12:48:03.308Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 22, 2020 12:48:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T12:48:03.390Z: Worker pool stopped.
    Oct 22, 2020 12:48:12 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-22_05_45_16-8826816439049501552 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): e24c3a09-b490-4c4b-9a63-693aaf9de564 and timestamp: 2020-10-22T12:48:12.584000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                      20.4

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 22, 2020 12:48:13 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 3 mins 9.394 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 57s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/jfgmr6odwxa5s

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1145

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1145/display/redirect?page=changes>

Changes:

[Chad Dombrova] [BEAM-7746] Fix typing in runners

[daniel.o.programmer] [BEAM-11074] Improve process for adding GPG keys to KEYS.

[daniel.o.programmer] [BEAM-11074] Removing manual steps for GPG keys.

[sjvanrossum] Update Fastjson to 1.2.69


------------------------------------------
[...truncated 276.72 KB...]
    INFO: BigQuery method is set to: DEFAULT
    Oct 22, 2020 6:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 22, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 22, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 22, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 22, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 22, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1214853303]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:150)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 22, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 22, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 22, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 22, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 22, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 22, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 22, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 22, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
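
The two log entries above are the point of this test: the planner replaced the generic BeamIOSourceRel with a BeamPushDownIOSourceRel, so only the four projected columns are requested and the WHERE clause is evaluated by the BigQuery Storage API rather than inside the pipeline. Written directly against BigQueryIO, the pushed-down read amounts to roughly the sketch below; the table reference is a placeholder, not the table this job actually reads.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    Pipeline pipeline = Pipeline.create();

    PCollection<TableRow> filtered =
        pipeline.apply(
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full") // placeholder table reference
                .withMethod(Method.DIRECT_READ)                // BigQuery Storage API read
                // Project push-down: only the columns in the SELECT list are requested.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Filter push-down: the WHERE clause runs server-side as a row restriction.
                .withRowRestriction(
                    "(`type` = 'story' OR `type` = 'job') AND `score` > 2"));
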
    Oct 22, 2020 6:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-jBJkSvPw3rzsFYdfxBAHM5OUpq3jg64nEkSy4ZDE79U.jar
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-jBJkSvPw3rzsFYdfxBAHM5OUpq3jg64nEkSy4ZDE79U.jar
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-gX5Wab49sV4ELNf55fvlNolAaEyJrp945iZmTlydwx4.jar
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-A1DGJYSDYbHaIilvMRSOsu5ovVaGT-W8UkhvOQWCMG0.jar
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-D3qF4DCYowLD6FlYfQvOFnlqDTpX4fSvIjElOE1EPIg.jar
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-5HooP5vCWvBhMFm-yBKwBCfMDwd9AZ8lqDGfycJo3CY.jar
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests--3H6XYcykPJ18lrNo24p7a8Ko6b_KaoJpfDKFOZUJlA.jar
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-no6om936YuFT-l8yvXP1joF1PIrgF8xUGtzel2Rl-SI.jar
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-hGKu2EzdPD076pxYkJD1VoDeM__OyC3WVRDLz1HoROY.jar
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-WSREm5VUtWkmFDkedS7JoyePYZhMkuppi4yrJSWwXnY.jar
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-eu3coeDzC3OvLmyik03aKmcdL2Kxnww-3PlbZ_YAsPQ.jar
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-Wk6LfQu81tpWqXXv7sa9Z9Ha6YnlaoyZJgaJa7Qp8Vc.jar
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-jEnRk8AR0Cf7xRxmpiaBnDk7YmmbQyfH9M5Xx61DYK0.jar
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-PLEBsjqAqKm9Y41rLGh01c5DVzYyPNhoU0eHNRWaYH8.jar
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-mb7Bd_vlD0jzv2dj55DfT1Bh4affAjbQPGP9GLJkdXI.jar
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-tvzJnsaVNjynXFLV7Xtjl66yjXEVZjCwCC4hxGXq5dM.jar
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-qzQW7ROmKT4YRtCAXIF7gk15y8RZValXtUSB1jEBeGc.jar
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-LciFr3hQFW4xHACQlw0Z6rOpPn6eTMtSMKH3JO_OGIE.jar
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-mddw8xIIGLqEIxIcubVP-XGYjHi59pnRq7z1e7JvNCA.jar
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-HLNZDoSKK7y3TL_xY0KY9ml0vNF9ujPfzs92ZG21HM4.jar
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-rKBmW6wVNKZhtAVDeUDM2EQFQHtokcGVQHyTI8YS2gc.jar
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-CBFBhIS8em91Lrh9riMYsr52wLhuvfif4irmFkFfh80.jar
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-xQiIk1BKnlDGmzFYVLVrPMlXk78e16LyixoeIzkICxI.jar
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-KpnzlzW-gylOtDKBE_QO5gOqbClnoSm3do1b473s2RA.jar
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-vKuNHn2z47mcgAvOavRj6Rp_mt8LmG36k5TYMhfJ8ak.jar
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test9186714746479055330.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-B9FT0jD3jrdknheOvPLasOlMAfYdj3rBlHHVoJztGx8.jar
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-4TVfCamgiYNJEuiaKPT-ZX4SorBAVvZSi-AqWD7-1Os.jar
    Oct 22, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-rHElKSR-cJriqJ1wDS5nOudgNOQpG-JPlm_OOEhZ8uo.jar
    Oct 22, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-2lZYLoDg7FK0EKYDoOi8qt8cJ7eYummEx9sgc13tbWQ.jar
    Oct 22, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-kVPDa8MKrnW0bbdbdXs27TDX7rFvoPpZko7PuNTMyUM.jar
    Oct 22, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-cA2hoxU9rjnC3qC7XlnKIm2Hdk80SrCeLLTDkOCgYYI.jar
    Oct 22, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.alibaba/fastjson/1.2.69/6cb063f1d527ff65bdbb9ea74888a5ffc3f92197/fastjson-1.2.69.jar to gs://temp-storage-for-perf-tests/loadtests/staging/fastjson-1.2.69-KniRdEoDeOAJmpuB3b7U0uGn6GMvb4b4_8-jg-UeScg.jar
    Oct 22, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 188 files cached, 31 files newly uploaded in 2 seconds
    Oct 22, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 22, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 22, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 22, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 22, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 22, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96015 bytes, hash 64c8197444d473b59a29fde98fa786f3c5da88c0399788e70fe818214bbeb5be> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ZMgZdETUc7WaKf3pj6eG88XaiMA5l4jnD-gYIUu-tb4.pb
    Oct 22, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 22, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-21_23_45_38-18405670671262176517?project=apache-beam-testing
    Oct 22, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-21_23_45_38-18405670671262176517
    Oct 22, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-21_23_45_38-18405670671262176517
    Oct 22, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-22T06:45:38.470Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 22, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T06:45:48.340Z: Worker configuration: n1-standard-1 in us-central1-b.
    Oct 22, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T06:45:49.067Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 22, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T06:45:49.227Z: Expanding GroupByKey operations into optimizable parts.
    Oct 22, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T06:45:49.295Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 22, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T06:45:49.499Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 22, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T06:45:49.597Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 22, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T06:45:49.672Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 22, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T06:45:49.796Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 22, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T06:45:50.439Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 22, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T06:45:50.522Z: Starting 5 workers in us-central1-b...
    Oct 22, 2020 6:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T06:46:12.566Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 22, 2020 6:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T06:46:17.036Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Oct 22, 2020 6:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T06:46:17.069Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Oct 22, 2020 6:46:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T06:46:22.310Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 22, 2020 6:46:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T06:46:39.627Z: Workers have started successfully.
    Oct 22, 2020 6:46:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T06:46:39.662Z: Workers have started successfully.
    Oct 22, 2020 6:47:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T06:47:12.578Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 22, 2020 6:47:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T06:47:13.002Z: Cleaning up.
    Oct 22, 2020 6:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T06:47:13.223Z: Stopping worker pool...
    Oct 22, 2020 6:48:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T06:48:01.456Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 22, 2020 6:48:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T06:48:01.509Z: Worker pool stopped.
    Oct 22, 2020 6:48:08 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-21_23_45_38-18405670671262176517 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): e6ae12f3-0cca-4ecf-9b49-52759ed72cb5 and timestamp: 2020-10-22T06:48:08.109000000Z:
                     Metric:                    Value:
                   read_time                    13.404
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 22, 2020 6:48:08 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
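
That warning is why no metrics reached InfluxDB: the publisher was invoked without a database and measurement configured, so it skips the write instead of failing the build. Below is a hedged sketch of wiring those properties through InfluxDBSettings; the builder method names are an assumption about org.apache.beam.sdk.testutils.publishing, and the host, database, and measurement values are placeholders, so treat this as an illustration rather than the job's actual configuration.

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    // Assumed builder API; if the real InfluxDBSettings differs, adjust the method names.
    InfluxDBSettings settings =
        InfluxDBSettings.builder()
            .withHost("http://localhost:8086")           // placeholder InfluxDB endpoint
            .withDatabase("beam_test_metrics")            // placeholder database name
            .withMeasurement("sql_bqio_read_java_batch")  // placeholder measurement name
            .get();
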

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 44.921 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 52s
107 actionable tasks: 70 executed, 37 from cache

Publishing build scan...
https://gradle.com/s/jxna77eibq7vo

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1144

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1144/display/redirect?page=changes>

Changes:

[Kenneth Knowles] Upgrade checkerframework to 3.7.0

[Kenneth Knowles] Upgrade checkerframework gradle plugin to 0.5.11

[noreply] Merge pull request #13153 from [BEAM-11093] Adding BQ snippets for

[noreply] [BEAM-11094] Only work with fully expanded PCollections in

[noreply] Use a newer Python interpreter version as default. (#13140)

[noreply] Delete unneeded PCollections in pipeline_from_stages() (#13100)


------------------------------------------
[...truncated 278.34 KB...]
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 22, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 22, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 22, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 22, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 22, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 22, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 22, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1214853303]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:150)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 22, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 22, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 22, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 22, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 22, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 22, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 22, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 22, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 22, 2020 12:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-4JJbfzKBu89IW6_qGd4435o6JnLxOVxuuQde7xh9QtM.jar
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-9lBgqJ-v_n7mmBjGAxHH-9ruJk57oJiwsTAtNux7eS0.jar
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-ca8Xaw69dX8FLUqGkkWvlG7pWMEPxQo-7opVDThJDF8.jar
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-Lo-o8tHNfXEHynA3TXClbgSvJ727Wegh_3gA1B5oCy0.jar
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-PlFWVq9DbfkfowhSL9t5Q9FZ3OFSXObvq0SBfqDKLpQ.jar
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-jiU5rAucipYZXBrMM3vj0c2krjmQVDCvkn73dkdprrw.jar
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-xL_r-YW6jP96qbBADejnNdO1JbyhvQzMRtE9TzjIOgI.jar
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-kIVqxokKnjrGVpPRfCwDaNWaFXofyoRPp_uqtvFc2VQ.jar
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-5FXHwIewmQPLbYTQIwL2TZnwejMj7zBMGL9yDTQlirI.jar
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-gfxWblfA3kHDJdzTdBTlpJZlHGVTSPVv1uvW5MQhfRw.jar
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test268997125544451033.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test--w242DgtEQD_8jc43GGHO7Pnv16SlIx2RiK0ilwY99Q.jar
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-XwaX2OLpIB_DXF4D0oTKDgMr0gGuw0jw-5xCr34yrck.jar
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-iwu4zBLva0lbmnPweGjmYyZNWjPGRKBmFBuoRkId7xA.jar
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-ci74hub1so7VXgWLgp82ad74iFewxysVaSa00RJp7yo.jar
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-Z0dYG2WeLH4HTaqZ9NMMedSFXtdiEGAYzwK_CSM1ZrQ.jar
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-RBh4DgEpG7MIPiMdbC87_LqepDOAQPfR3xdgsDapG1Y.jar
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-VAPfEtXOmAgrAJ2taMRFrDwQicO1SZYxWdLL8wJS-0k.jar
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-fuvCzqToMUtrNTEH80VGz-nAsUoBBti2cb7BzO5pu6U.jar
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-4JJbfzKBu89IW6_qGd4435o6JnLxOVxuuQde7xh9QtM.jar
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-BYWvGENgoEzH8587zjkMILi5SH1MUBGRwDBYMcDZV-U.jar
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-0HvJruWgCfr0ZQAS5onxyUlglU06qIivz7URWB9mzNM.jar
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-5anxSoUxjEGcWr6f2t47-ml8DGrLRT0Qq9OAvVXzC4I.jar
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-mCB0NFjuHZtPgs9xJuFTWQNfSiYNXs_x-jhtxsG4PyY.jar
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-I-U5XEklotLWiGKpCfRSqoxrw7JxVUSqkkauVr6FYTA.jar
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-g_IXTSbCi4Gih8sjF9PwL4QPipfKTaAZVJdqU0UZ5Ts.jar
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-FLnp86qJKDkoTTh3_XRHiCrBhwzPRjcxHVUkAD3Xkbk.jar
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-UFD8lgzQBadt8XXDftjYO23POBVuei507ex635GQcyI.jar
    Oct 22, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-Y0gCjire59Oes54U6yi9uw04FF8og5tHLxGZlR1XMzk.jar
    Oct 22, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-sB3JKb9aikalD__WLTmlPnxEFEEtkbYocL0ult4ddMY.jar
    Oct 22, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-t-OkVwZN_u8myUPHwjLeSuWtfAFVdoSzJHciJEfUVm0.jar
    Oct 22, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-vEBKEPkT3Ed_zlRSl9xrawzong_pHm86HjYGKcca7yI.jar
    Oct 22, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 0 seconds
    Oct 22, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 22, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 22, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 22, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 22, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 22, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96014 bytes, hash 2c3afcdcaf2451ff40688c1bfaf6ccbd683469cd1dc9e37613ebff21100daa87> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-LDr83K8kUf9AaIwb-vbMvWg0ac0dyeN2E-v_IRANqoc.pb
    Oct 22, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 22, 2020 12:45:44 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-21_17_45_42-17141013393565979484?project=apache-beam-testing
    Oct 22, 2020 12:45:44 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-21_17_45_42-17141013393565979484
    Oct 22, 2020 12:45:44 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-21_17_45_42-17141013393565979484
    Oct 22, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-22T00:45:43.021Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T00:45:50.057Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T00:45:50.777Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T00:45:50.814Z: Expanding GroupByKey operations into optimizable parts.
    Oct 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T00:45:50.842Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T00:45:50.925Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T00:45:50.962Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T00:45:50.984Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T00:45:51.020Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T00:45:51.401Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T00:45:51.497Z: Starting 5 workers in us-central1-c...
    Oct 22, 2020 12:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T00:46:18.630Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 22, 2020 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T00:46:20.106Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 22, 2020 12:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T00:46:42.156Z: Workers have started successfully.
    Oct 22, 2020 12:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T00:46:42.188Z: Workers have started successfully.
    Oct 22, 2020 12:47:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T00:47:18.227Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 22, 2020 12:47:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T00:47:18.368Z: Cleaning up.
    Oct 22, 2020 12:47:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T00:47:18.456Z: Stopping worker pool...
    Oct 22, 2020 12:48:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T00:48:11.780Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 22, 2020 12:48:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-22T00:48:11.832Z: Worker pool stopped.
    Oct 22, 2020 12:48:16 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-21_17_45_42-17141013393565979484 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 3aed039d-0749-4c95-b0ea-306038089a81 and timestamp: 2020-10-22T00:48:16.879000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     14.64

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 22, 2020 12:48:17 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 48.272 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 1s
107 actionable tasks: 74 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/gfjiz3ktqgg24

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1143

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1143/display/redirect?page=changes>

Changes:

[neville.lyh] [BEAM-10787] make external sorter 2048MB memory limit Hadoop only

[Kenneth Knowles] Make streaming and "auto" Dataflow ValidatesRunner jobs have different

[sjvanrossum] Change imports of org.codehaus.jackson to com.fasterxml.jackson

[Boyuan Zhang] Lengthprefix any input coder for an ProcessBundleDescriptor.


------------------------------------------
[...truncated 284.40 KB...]
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 21, 2020 6:46:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 21, 2020 6:46:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 21, 2020 6:46:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 21, 2020 6:46:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 21, 2020 6:46:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 21, 2020 6:46:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 21, 2020 6:46:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
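
    The failure above is Beam's generic missing-coder error for a PCollection<Row>; as the
    message itself suggests, the usual remedy is to attach the Row schema (or an explicit
    RowCoder) to the PCollection produced by the ParDo. A minimal, hypothetical sketch of that
    remedy follows; the class, method and variable names are illustrative and are not taken
    from BigQueryIOPushDownIT.

    // Hypothetical helper illustrating the fix named in the error message above;
    // not part of BigQueryIOPushDownIT. Assumes the caller already knows the Row schema.
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowCoderFix {
      // Preferred: declare the schema so Beam can infer a coder for the Row PCollection.
      static PCollection<Row> withRowSchema(PCollection<Row> rows, Schema schema) {
        return rows.setRowSchema(schema);
      }

      // Equivalent alternative: set a RowCoder explicitly, matching the .setCoder() hint.
      static PCollection<Row> withExplicitCoder(PCollection<Row> rows, Schema schema) {
        return rows.setCoder(RowCoder.of(schema));
      }
    }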

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 21, 2020 6:46:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 21, 2020 6:46:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 21, 2020 6:46:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 21, 2020 6:46:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 21, 2020 6:46:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 21, 2020 6:46:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 21, 2020 6:46:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 21, 2020 6:46:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 21, 2020 6:46:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-rx1nbsia2WOulO-eeWTPBVIE6zUPZWZUcI6LlZod4tg.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-kigREHRaJWhsite3cTSotg4IMdJqRFT-8UiX192oSYg.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-uW-Ah9lUj58mhT89TOyyyNEigA4u61_q0Mz0mSk0ad8.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-rx1nbsia2WOulO-eeWTPBVIE6zUPZWZUcI6LlZod4tg.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-wSv8Rp5w3bACggt1Rf08nZ5JliuFr7Z66sBZhSSGaVo.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-uLW8HNmbvfwkpwRb_2DAikSez4wDAhFZspI0QtHOmXk.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-RsSS2J4LTP7pFxVWeiE2mUC8xAdZgPgTloHuPV-YW5k.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-TzcAre6LojIKJuBbGZ-CFnY_QFM2nt_G6GOANSTNaZI.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-hiZUXTc5fetjIiHgNmWIh5gLB_DUX-JhNYPOYWJxuSc.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-qbvIUpECcnXf5CQZF0ACrnXPqu5D5eyeQKQBcBp1a2E.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-rwmHrUox6MrLf6M7Ddno4g_cdM2E-B2qp79MO2LPkjU.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-vWbsNQg0yQQjkPCQ4mrvNbMf4GXYosf4pYI61B2DXAc.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-WbxSHDMg1fCFZeD-OdzO1gCZjMOulP2DfdBGWIlJRj8.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-X2KRLzgQ4lwYIIdmQSMKhAYOaVtA2cQHEe94mI3zGBE.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-Aa-MVqmR03Un1d0QsPnn8Y3TkiCRlaHw7ug_DfFOais.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-8_qJnp4eUZcK3_TPHvaXVyR3hJsgkFppefqiHR7XZYA.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-qz2_7tVfo5XJ5G8ie3_odv0FfHccsRvpP2K-G5xx4hA.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6161024474663349951.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-GsaschyKo8HlLZyyneJoTA-33WEECqps_oZuQKvHGvk.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-GvGn10eA_JJJ0DP1NlO8fMz97XT1wNZFrkcZtMBNVZY.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-2VSpyi_IfnTeqE1Oc_fbRBU4TNv1nNZ2FS2uZSo0NxY.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-GCZHiMZALkt5OVQ7M0DKs-FflWM58We1mx7H_4hNZ-M.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-2g8nTb_Qpp_bfxMV7ROPxpkBewM2uQgvPU-h2Hu65Ek.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-BtymOhflnlKAzIXNE3FlLsEHAcWjPHsAAuSdwSCiQno.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-KTeH6e3C4H28zPOEiM33vmIQGnpWlfjg4t2Xh61TCfM.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-4knq5uvgzD_hDLQkv833ouYpHt1NZbjmDw-ygZAajjY.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-H94ZFhTflc5-69pgdvA-9T0dliyLubZlxk9Z17UGYI4.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-woUb_0veJhdDzcuVB4UA4gz1XLuJWwCx5XJpqMCy00E.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-12uuxllP-0qgXiSkErwevQ-1xbtpykiMoLSEb4mQGhc.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-GZzdKi5bBsBdHd4kziFLptlP2o453NmI4jGQsbXN2WQ.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-cNSlrRcYkCgnqOk3P6p8LhqQVioJuc5Fah_sJwhxouI.jar
    Oct 21, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-b0vSf2_51O-PSBmkUqUjTRTCnLprF6b0YqltDJ8rNEc.jar
    Oct 21, 2020 6:46:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 21, 2020 6:46:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 21, 2020 6:46:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 21, 2020 6:46:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 21, 2020 6:46:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 21, 2020 6:46:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 21, 2020 6:46:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96015 bytes, hash 2fd88dca3897494267cc82e5895caa1f18f283d1297b0c793cdde8a6b87606a1> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-L9iNyjiXSUJnzILliVyqHxjyg9Epewx5PN3oprh2BqE.pb
    Oct 21, 2020 6:46:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 21, 2020 6:46:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-21_11_46_25-4748032645771089531?project=apache-beam-testing
    Oct 21, 2020 6:46:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-21_11_46_25-4748032645771089531
    Oct 21, 2020 6:46:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-21_11_46_25-4748032645771089531
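
    The same cancellation can also be requested from the submitting process instead of the
    gcloud CLI, via the PipelineResult handle returned by Pipeline#run(). The sketch below is
    a hedged illustration of that approach, not code from this test; the class and method
    names are invented for the example.

    // Hypothetical sketch: programmatic counterpart of the gcloud cancel command above.
    import java.io.IOException;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;

    class CancelFromClient {
      static void submitThenCancel(Pipeline pipeline) throws IOException {
        PipelineResult result = pipeline.run(); // submits the job and returns a handle
        // ... later, if the job should not keep running ...
        result.cancel();                        // asks the runner/service to cancel the job
      }
    }
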
    Oct 21, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-21T18:46:25.684Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 21, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T18:46:34.251Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 21, 2020 6:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T18:46:35.055Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 21, 2020 6:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T18:46:35.097Z: Expanding GroupByKey operations into optimizable parts.
    Oct 21, 2020 6:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T18:46:35.126Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 21, 2020 6:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T18:46:35.202Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 21, 2020 6:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T18:46:35.247Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 21, 2020 6:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T18:46:35.283Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 21, 2020 6:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T18:46:35.310Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 21, 2020 6:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T18:46:35.739Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 21, 2020 6:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T18:46:35.813Z: Starting 5 workers in us-central1-c...
    Oct 21, 2020 6:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T18:46:51.680Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 21, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T18:46:57.818Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 21, 2020 6:47:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T18:47:20.059Z: Workers have started successfully.
    Oct 21, 2020 6:47:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T18:47:20.096Z: Workers have started successfully.
    Oct 21, 2020 6:47:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T18:47:53.962Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 21, 2020 6:47:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T18:47:54.119Z: Cleaning up.
    Oct 21, 2020 6:47:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T18:47:54.220Z: Stopping worker pool...
    Oct 21, 2020 6:48:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T18:48:52.769Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 21, 2020 6:48:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T18:48:52.806Z: Worker pool stopped.
    Oct 21, 2020 6:49:00 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-21_11_46_25-4748032645771089531 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 99902d76-5d4e-4a25-b119-c64c7f7b81f5 and timestamp: 2020-10-21T18:49:00.237000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     14.22

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 21, 2020 6:49:00 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 4 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 51.345 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 45s
107 actionable tasks: 79 executed, 28 from cache

Publishing build scan...
https://gradle.com/s/l6ntu2vhhjzco

Stopped 3 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1142

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1142/display/redirect?page=changes>

Changes:

[heejong] [BEAM-11018] Use metric for Python BigQuery streaming insert API latency

[heejong] move histogram to internal.metrics

[heejong] add missing __init__.py

[heejong] fix error

[heejong] fix docstring error

[heejong] fix test error

[heejong] modify the comment

[noreply] [BEAM-11075] Customization of key size and value size (#13149)


------------------------------------------
[...truncated 270.36 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 21, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 21, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 21, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 21, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 21, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 21, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 21, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 21, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 21, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 21, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 21, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 21, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 21, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 21, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 21, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 21, 2020 12:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-7pzA9SaS-TxHn1gjEzFmWkRrXdVfsGrqs5yj1z-yFyA.jar
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-1hS_ga_ZDnkZ9K7j3spKrnkAk1CCYW5OIieFLD1QBdQ.jar
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-Nv7Q5FcASIlys5KrNs2ntwVSmnuaHbZNSLbnPa4Q7vs.jar
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-Uflb5akWh8sd8Enl0mGuUUMsY4KYSZ4lEJt-Nlr8Hj0.jar
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7845300060175774694.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-fTlQDZz1X4bkujxnriraPU_7u9X2R5cXM_wgHpD7y4g.jar
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-ooiwNmCnGXgMyP5GIIOfXGDqHkShzzqZ8t76FHmFj_M.jar
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-mDS9YP75vTwvWjkVmqM0VeMxXr5_Z_N3bOHx5iK4OSQ.jar
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-mXA00Nj95fZA8fqVva3VV6Ra6atZq1QwNP9muoj0mnU.jar
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-Yw6SW1jLL2wC6hm3MTyHHgFHVXcTV_dcEBOUsIDbRhU.jar
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-NSIBckyl6Azs23Azq5P5auOOa_YDiXWct2qha-7DFtk.jar
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-RymFiHycB1P1gHCBG5jSNQIWh6N4lIbAlMFfjbeebqU.jar
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-WjdK457qwqJ1BocF2mfwq2NohjSKkMccC69QlUk-7u0.jar
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-bih-qfkWJ6aGS4vqhKgzXrkx1ENzCZZ7f4w7mxGI64I.jar
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-kz_Y9nBkdr_UzdatJxa1da4x6Lpamy_P_RTn3xfL99k.jar
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-u_oUp5YHWCGIbCaaPWgLYDru4dgjv_Ws9dFOtGQcW_I.jar
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-AKym2YS3yV8d2-4F_pgjoOWVC12QuNsr83hs2QX9aPM.jar
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-in_0HhhaoVlnWFDSkg2NAF4zGDPugeq5oa0CADtF2io.jar
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-acRMXv2C2v4kNosdlxEfQedknX1MajKQIZocmMdjwrQ.jar
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-xlsOmRBiKMbPYSAld8rTRmr_syYxZQDt4wr3hN4ENgg.jar
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-qXdsC4OY4fd35bYlL_9msBZmDrUz5aVXRR-6Dd172LQ.jar
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-a0BXyljjCMAtzw0UYxnrI26XFQ1um6ejfbZJZSGhZmo.jar
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-rqkUdQ5hUMx_o4z5F_JLJTky2lT1LErLEKpN7P8hPV4.jar
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-uTnOQi8eSJUO719my5N-vVHmEEQR6ehGNieyDShAPgI.jar
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-k9-yUk0-CHVJTiM3v9i_dvv8n3OZbgSELsYBa_xefWA.jar
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-7pzA9SaS-TxHn1gjEzFmWkRrXdVfsGrqs5yj1z-yFyA.jar
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-k2nTCsPGmu-Khy6Fr5UbtyzZBnZYnkrZXgY_58_uB7Q.jar
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-jxTu0HxiZJGbDjIKCakMku5mgdUO6Qy54gD5FuOSA78.jar
    Oct 21, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-43BIneub5tvF_Y9ihaX96mh05s0B2oIESuzYr2cUfcY.jar
    Oct 21, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-C4-7UwNGGBg3JuwunKPvGwFI3b5m_B7qZQiDcuTQfsM.jar
    Oct 21, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-uxP9PJ7VgKHpQ3MU93pP1mrGhRVREFHDpkSYXX981y0.jar
    Oct 21, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-KEkE-_eh7P8gEyvPRXkl9ex85ARlX1HDsajnXTIU2pQ.jar
    Oct 21, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 0 seconds
    Oct 21, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 21, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 21, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 21, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 21, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 21, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96015 bytes, hash 317d6f836bdf8854292566e3c5ac70bc65fb023af90e9b1216fd0a1fed6679b0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-MX1vg2vfiFQpJWbjxaxwvGX7Ajr5DpsSFv0KH-1mebA.pb
    Oct 21, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 21, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-21_05_45_15-4605555505847449641?project=apache-beam-testing
    Oct 21, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-21_05_45_15-4605555505847449641
    Oct 21, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-21_05_45_15-4605555505847449641
    Oct 21, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-21T12:45:15.999Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 21, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T12:45:24.041Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T12:45:24.755Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T12:45:24.808Z: Expanding GroupByKey operations into optimizable parts.
    Oct 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T12:45:24.854Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T12:45:24.941Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T12:45:24.975Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T12:45:25.013Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T12:45:25.057Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T12:45:25.943Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T12:45:26.035Z: Starting 5 workers in us-central1-c...
    Oct 21, 2020 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T12:45:42.466Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 21, 2020 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T12:45:54.527Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 21, 2020 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T12:46:14.037Z: Workers have started successfully.
    Oct 21, 2020 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T12:46:14.111Z: Workers have started successfully.
    Oct 21, 2020 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T12:46:51.292Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 21, 2020 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T12:46:51.445Z: Cleaning up.
    Oct 21, 2020 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T12:46:51.535Z: Stopping worker pool...
    Oct 21, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T12:47:42.106Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 21, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T12:47:42.229Z: Worker pool stopped.
    Oct 21, 2020 12:47:50 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-21_05_45_15-4605555505847449641 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): dcfd6be5-b746-4839-9796-3520182ed431 and timestamp: 2020-10-21T12:47:50.259000000Z:
                     Metric:                    Value:
                   read_time                    17.823
                 fields_read                 4375276.0
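
The read_time and fields_read values above come from the monitoring ParDos visible in the pipeline graph (ParDo(RowMonitor), ParDo(TimeMonitor)). A generic sketch of how such a counter can be recorded with the standard Beam Metrics API follows; the class and metric names are illustrative, this is not the test's actual implementation:

    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.Row;

    /** Illustrative pass-through monitor; not the test's actual RowMonitor. */
    public class FieldCountMonitor extends DoFn<Row, Row> {
      private final Counter fieldsRead =
          Metrics.counter(FieldCountMonitor.class, "fields_read");

      @ProcessElement
      public void processElement(@Element Row row, OutputReceiver<Row> out) {
        fieldsRead.inc(row.getFieldCount());  // count every field carried by the row
        out.output(row);                      // pass the row through unchanged
      }
    }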

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 21, 2020 12:47:50 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 47.502 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 34s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/nk4gdsegkkymc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1141

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1141/display/redirect?page=changes>

Changes:

[noreply] Correct docstrings in existing version of ContextualTextIO (#13146)


------------------------------------------
[...truncated 271.50 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 21, 2020 6:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 21, 2020 6:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 21, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 21, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 21, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 21, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 21, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
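
The failure above is Beam's generic coder-inference error for a PCollection of Row: no coder was set, the registry cannot infer one for Row, and the producing PTransform does not supply one. The remedy the message itself points at is PCollection.setRowSchema. Below is a minimal, self-contained sketch of that fix; the schema, field names, and DoFn are illustrative only, not the test's actual code:

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());

        // Schema mirroring the four projected columns of the failing query
        // (illustrative only, not what the test builds internally).
        final Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt32Field("score")
            .build();

        PCollection<Row> rows = p
            .apply(Create.of(Arrays.asList("story", "job")))
            .apply("ToRows", ParDo.of(new DoFn<String, Row>() {
              @ProcessElement
              public void processElement(@Element String type, OutputReceiver<Row> out) {
                out.output(Row.withSchema(schema)
                    .addValues("someone", type, "a title", 3)
                    .build());
              }
            }))
            // Without this call the SDK cannot infer a coder for Row and fails with the
            // IllegalStateException shown above; setRowSchema attaches the schema (RowCoder).
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }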

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 21, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 21, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 21, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 21, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 21, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 21, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 21, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 21, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
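
The push-down reported here means the projection (by, type, title, score) and the filter are evaluated by the BigQuery Storage Read API instead of in the downstream BeamCalcRel. Expressed directly with BigQueryIO, the equivalent read looks roughly like the sketch below; the table spec is an illustrative placeholder and this is not the test's code:

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Same projection and filter that the SQL planner pushed down above,
        // handed straight to the BigQuery Storage Read API. The table spec is
        // a placeholder, not the table used by the test.
        p.apply("DirectReadWithPushDown",
            BigQueryIO.readTableRows()
                .from("my-project:my_dataset.hacker_news")
                .withMethod(Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
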
    Oct 21, 2020 6:45:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT---1ilnEgE1cle5lFUPTNhJVpoJO5M0Kt8fxRyekAV50.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-U6z3PAUFyw45QzZimPADgVHRGWqOevg_OqhlJ8VmikU.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-f7F1nljjS7nKlSdFc3OA0hqbtFQb1pzw4ko_WvMw0g8.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-4IMkCOS59CXU9g_4c9lHPZP1s3yXHT-lfLx0yaCOa9A.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-FHdWTHxd0K5CnLYp6V0l93MUUJcLkeEzBwRHWw4T9QU.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-5F8rxjn3_vMt9yqWAKnLMr43VGHA1FTqlp1zW_uyz4Q.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-7_I8oMmI_mfNLTL8X2sBIeP9MM9aPQoW29TgmyNmfoQ.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-3jeEI5xPyxKrd99C5zSpFMhYjRCulnkDBwzxcKI-r1M.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-d7jNRobfuQIqGbp3bHV-vIl57OCD52w_FHhH3h6i_X8.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-f3Ed-zCLVE4LEQDu4MEDQZGC865UxAAl0408E2HwE7s.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-STzMyZcYz8rAxnohH9D7jdjwu8oOpc1uR7uxv5WgAzI.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-54LOYMRFx6uti2xjtljHguK4ZLeuMxA2_BX_DYMFhNw.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-yr08dYLnNatwLJCCqrO0M3MqWsRQN865jTFzVLA4vyo.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-BOXHc6ztk5lf__PcT9DwbyQi_VjEuzV303RCetW2ZLo.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-KffaJX17W-njKqAuEc_wz1aHZgPDxK9X0aXg1P6rUy4.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-LXMMuWJGjuy3FZ8zYge9Q9gPZIOhOPF1LOTvxqYFwLc.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-epF0Vw036lagZuxYLfZgXCYv6rtn0KAKTIFtdn5wFbU.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-JrK92qDP_pKoIfGVvPCbixbFlduEPHW4uu-7Q_RIzxI.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-XKtWdJPhC-uCoqhvZC-nLsnb1sCJ5p4DGSYxFIPsHxg.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-u4izwsNVAHkxjPNAOaHk0AzBzvYMOyaVvRHsbTqpfiE.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-L1jXxnSNyBCKiRnJncP3gyWFBoctNdOAJJfrurna8NQ.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-5pEvnWI3xwEFgrb7Ci3SemqOUoqpxtjtC1uxeNy64DQ.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-qb1gja51XixloZzopeZBsAPk61rUXDYgvsDDAweZbTA.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-8l5ZO8eCtvebLSYGMFc64sPRLtENX1Sa1CfTMoKK1lE.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1374900208404473373.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-qo5SF6gO6DAdZ-z4IbIBFh7kOr7ZESdPGRCSGu9j2Io.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT---1ilnEgE1cle5lFUPTNhJVpoJO5M0Kt8fxRyekAV50.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-8HDB33phwn9uht7cw9OH8B-OoIKBa3uHdOPwS1gBe2c.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-FvVaZJURO2c2s5gqTp1a6b0eyUF1tdMUnp74Xr0vkDE.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-KIWFhpwNlAU0a9vWO7ZSA9n-G--q2WSTyr-csj5b46I.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT--hHidUVITQG8r6-9W50qJzl4FR1ElPhPY7tbwbLDZPg.jar
    Oct 21, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-RPydGav8wspx2yAZaWDq0RMebiHsg2BOAveo5_lA8D0.jar
    Oct 21, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 21, 2020 6:45:49 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 21, 2020 6:45:49 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 21, 2020 6:45:49 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 21, 2020 6:45:49 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 21, 2020 6:45:49 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 21, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96013 bytes, hash 5248f319d81db1461ae90ea75bb78bec2bc6551e5780c93712406f40195455d3> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-UkjzGdgdsUYa6Q6nW7eL7CvGVR5XgMk3EkBvQBlUVdM.pb
    Oct 21, 2020 6:45:49 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 21, 2020 6:45:51 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-20_23_45_50-1195019923380058647?project=apache-beam-testing
    Oct 21, 2020 6:45:51 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-20_23_45_50-1195019923380058647
    Oct 21, 2020 6:45:51 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-20_23_45_50-1195019923380058647
    Oct 21, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-21T06:45:50.083Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 21, 2020 6:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T06:45:58.484Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 21, 2020 6:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T06:45:59.099Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 21, 2020 6:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T06:45:59.154Z: Expanding GroupByKey operations into optimizable parts.
    Oct 21, 2020 6:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T06:45:59.187Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 21, 2020 6:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T06:45:59.273Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 21, 2020 6:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T06:45:59.307Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 21, 2020 6:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T06:45:59.340Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 21, 2020 6:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T06:45:59.373Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 21, 2020 6:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T06:45:59.916Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 21, 2020 6:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T06:45:59.993Z: Starting 5 workers in us-central1-f...
    Oct 21, 2020 6:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T06:46:18.309Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 21, 2020 6:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T06:46:26.380Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 21, 2020 6:46:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T06:46:49.424Z: Workers have started successfully.
    Oct 21, 2020 6:46:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T06:46:49.469Z: Workers have started successfully.
    Oct 21, 2020 6:47:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T06:47:23.316Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 21, 2020 6:47:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T06:47:23.556Z: Cleaning up.
    Oct 21, 2020 6:47:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T06:47:23.644Z: Stopping worker pool...
    Oct 21, 2020 6:48:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T06:48:14.122Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 21, 2020 6:48:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T06:48:14.177Z: Worker pool stopped.
    Oct 21, 2020 6:48:21 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-20_23_45_50-1195019923380058647 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 9bcce506-bbfc-401b-8e83-bdb0ef38623a and timestamp: 2020-10-21T06:48:21.137000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.073

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 21, 2020 6:48:21 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 47.083 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 35s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/f4jzenzltlam4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1140

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1140/display/redirect?page=changes>

Changes:

[sjvanrossum] Fix Jackson being imported from both org.codehaus.jackson (old) and


------------------------------------------
[...truncated 273.47 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 21, 2020 12:45:41 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 21, 2020 12:45:41 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 21, 2020 12:45:41 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 21, 2020 12:45:41 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 21, 2020 12:45:41 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 21, 2020 12:45:41 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 21, 2020 12:45:41 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 21, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 21, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 21, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 21, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 21, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 21, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 21, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 21, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 21, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 21, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-zXD2EwSB-gQBBfMKov6ISElsZu0hwhh6VfWDQKc3ULI.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-CNw5ozJScRVHNbAofFA6M4BvakujwEER6qmFhH_UVTw.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-AdP94FJa_ROE8KNrOxMjr-r3ylAH8tYqLlRd6euiRyg.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-5qkyDgxEucepGmUNtNkZAq6jJX-zGhjPTdA6EAaqhBU.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-XOkXH08PDnlS09wmDpcC4vLWbe0aSy8eoMtcZiAyrDg.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-e4CT-uCunK8t4Gnr3ZleQUAXX6J8_rYreBLNWKgKE0Y.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-nxq9XggbR7d7vfpY0CFnspOJF6S7ruV5vuSJNFET7X0.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-RV4xjZCTSt5eSV9HJWgHDswHwc-FyvtNagcNg-pIBZI.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT--SbPkzby1-FtnC01GnfjZic4Wk6XBdkIt8aWI9BmHNg.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4116147286636457185.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-fZyx191RMZzqetTJpZ-HCm007iJc-jddD2Wvz7CZVN8.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-pExiMi9KL-mz-a1_4i9Guw5Vf0rrz_xr1w9EHfRfCI4.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-WMRs1ikshdPkF0Zi8DjinbXvNe_YWAg-FCfNBItNN8w.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-fUr9BR55oeoOXKIm3aGfyFrsd5b2hsYQz0Dw_8nLQ5w.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-wBer_ZMZLVc48A5AKI0y3pF8SxrSbLm2m4LC-j15Enk.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-K6J0n7akC7WGHNlQKUREwZJweVPu_SX7jo7bHy2e8Uw.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-19mt0X4AjeL5OOI_qSiPcAjsA15b4_HN-UXhkyK3zT4.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-DA5vdpfJbIv6rqzhY8sU_3Nn3ZKPpIWkZD8AD2quyOo.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-FOlG5-YVBJa4BbaZE92IXdGbMlQ8X0a_aGPZuePQHws.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests--W3JLYeeYsZxYfRig8b3cfUwcGMe_jsDSr8J_kcFiPI.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-zC9rG81l_RIHnO_aCWdpvS9lYjtWCL0lZ47kLb-K_54.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-j0KwDV7qPFg1j2Dwnj6peqROgnNGBD-raXb1prJ-s0U.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-zXD2EwSB-gQBBfMKov6ISElsZu0hwhh6VfWDQKc3ULI.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-zUXo5jTGfDR3aCv9cm7QEfwRTnnQPx9WAxZ9dgLPqdw.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-AcPmcPu1OyrDfnWtr0nvOwZyG1qa2DC-dVL2kz19tYE.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-J5kULZIbQJuqpFKRoS_nGha5E92PYIvdn3qVo8ltplw.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-z74qcnNd3NEPA-eVFLRxhh4nHvNqFQVl_YM4_rOdvgM.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-FRPV8U7XXrd_dx-0xooU5fXT-5eo4YNV0klO_2w6uaA.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-TDeR0yeIlGAHkLuOZQ3ZPzMvx-Y88lAEam7Fi6v-XzY.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-Px6-mQLXb1navgwzru3oxGwR8PoeVPvU_jO8c3PSsuU.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-xDfB949WUsdzlBtGBnnivx8bcHBBKFrvdGdfBgrpgyk.jar
    Oct 21, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-Uj21gxavBT7qzGmOZhHmg8Jco2uLHBfXa5HjuKfpQB4.jar
    Oct 21, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 21, 2020 12:45:47 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 21, 2020 12:45:47 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 21, 2020 12:45:47 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 21, 2020 12:45:47 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 21, 2020 12:45:47 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 21, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96015 bytes, hash b940a75fcb37be036cb5e347c4bf3544aeb039e15cc7a75f805d998364117a5c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-uUCnX8s3vgNsteNHxL81RK6wOeFcx6dfgF2Zg2QRelw.pb
    Oct 21, 2020 12:45:47 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 21, 2020 12:45:49 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-20_17_45_47-16699471151506461584?project=apache-beam-testing
    Oct 21, 2020 12:45:49 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-20_17_45_47-16699471151506461584
    Oct 21, 2020 12:45:49 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-20_17_45_47-16699471151506461584
    Oct 21, 2020 12:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-21T00:45:47.729Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 21, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T00:45:55.281Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 21, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T00:45:56.000Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 21, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T00:45:56.143Z: Expanding GroupByKey operations into optimizable parts.
    Oct 21, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T00:45:56.173Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 21, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T00:45:56.249Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 21, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T00:45:56.277Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 21, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T00:45:56.312Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 21, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T00:45:56.346Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 21, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T00:45:56.684Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 21, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T00:45:56.854Z: Starting 5 workers in us-central1-c...
    Oct 21, 2020 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T00:46:20.169Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Oct 21, 2020 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T00:46:20.200Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Oct 21, 2020 12:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T00:46:25.484Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 21, 2020 12:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T00:46:28.816Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 21, 2020 12:46:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T00:46:46.549Z: Workers have started successfully.
    Oct 21, 2020 12:46:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T00:46:46.570Z: Workers have started successfully.
    Oct 21, 2020 12:47:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T00:47:23.761Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 21, 2020 12:47:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T00:47:24.008Z: Cleaning up.
    Oct 21, 2020 12:47:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T00:47:24.074Z: Stopping worker pool...
    Oct 21, 2020 12:48:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T00:48:15.309Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 21, 2020 12:48:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-21T00:48:15.345Z: Worker pool stopped.
    Oct 21, 2020 12:48:20 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-20_17_45_47-16699471151506461584 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 4ed4873c-66ba-4a26-92d8-2b84a7657d69 and timestamp: 2020-10-21T00:48:20.514000000Z:
                     Metric:                    Value:
                   read_time                    17.011
                 fields_read                 4375276.0
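
The read_time and fields_read figures above are Beam metrics collected from the finished Dataflow job. A minimal sketch of how such a counter can be queried from a PipelineResult is below; this is not the test's own code, the metric namespace is an assumption, and only the metric name is taken from the output above.

    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricResult;
    import org.apache.beam.sdk.metrics.MetricsFilter;

    public class MetricsQuerySketch {
      // Call after result.waitUntilFinish(); the namespace below is assumed,
      // only the metric name "fields_read" comes from the log output above.
      static void printFieldsRead(PipelineResult result) {
        MetricQueryResults metrics =
            result.metrics()
                .queryMetrics(
                    MetricsFilter.builder()
                        .addNameFilter(
                            MetricNameFilter.named("BigQueryIOPushDownIT", "fields_read"))
                        .build());
        for (MetricResult<Long> counter : metrics.getCounters()) {
          System.out.println(counter.getName() + " = " + counter.getAttempted());
        }
      }
    }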

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 21, 2020 12:48:20 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 48.587 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 3s
107 actionable tasks: 69 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/bej3s5px5cena

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1139

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1139/display/redirect?page=changes>

Changes:

[rohde.samuel] Implement the BatchDataflowWorker hot key logging

[rohde.samuel] Change key logging to allow for null keys

[rohde.samuel] Implement the StreamingDataflowWorker hot key logging

[rohde.samuel] change logging to enable null keys

[Kyle Weaver] Enforce $JAVA_HOME setting in RC build script.

[noreply] Check keycoder not null


------------------------------------------
[...truncated 274.47 KB...]
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 20, 2020 6:45:50 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 20, 2020 6:45:50 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 20, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 20, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 20, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 20, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 20, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
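
The IllegalStateException above is a pipeline-construction failure, and the error text itself names the fix: attach a schema to the Row PCollection with PCollection.setRowSchema so a RowCoder can be built. A minimal, hypothetical sketch of that fix follows; it is not the code of BigQueryIOPushDownIT, the field names mirror the query's projection, and the field types are assumed.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFixSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Field names mirror the projection in the query above; the types are assumed.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> input =
            p.apply(
                Create.of(Row.withSchema(schema).addValues("alice", "story", "Hello", 3L).build())
                    .withRowSchema(schema));

        // A pass-through ParDo standing in for ParDo(RowMonitor): its output is a
        // PCollection<Row> whose coder cannot be inferred automatically.
        PCollection<Row> monitored =
            input
                .apply(
                    "RowMonitor",
                    ParDo.of(
                        new DoFn<Row, Row>() {
                          @ProcessElement
                          public void processElement(@Element Row row, OutputReceiver<Row> out) {
                            out.output(row);
                          }
                        }))
                // Without this call, pipeline construction fails with the
                // IllegalStateException shown above; the schema gives the output a RowCoder.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }

With the schema attached to the ParDo output, the runner can construct a coder for the Rows and pipeline construction proceeds.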

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 20, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 20, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 20, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 20, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 20, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 20, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 20, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 20, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
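
At the BigQueryIO level, the projection and filter pushed down above amount to a direct read with selected fields and a row restriction. A minimal sketch under that assumption is below; the table reference is illustrative rather than taken from the test, the usual GCP pipeline options (project, temp location) are omitted, and the field list and filter string are the ones the planner reports.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // The table reference is illustrative; the selected fields and the row
        // restriction are the ones reported as pushed down above.
        PCollection<TableRow> rows =
            p.apply(
                "Read Input BQ Rows with push-down",
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full")
                    .withMethod(TypedRead.Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }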
    Oct 20, 2020 6:45:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-jbW6E_96mGadYvA5ObReDoAncByRtQLCRm7fPrsmHPE.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-C2KvOtYKLtjcCJH1_OsNRltAJVpXV8v4yT7wZ251VtI.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-6ySIzSSMJhcIJqXjTd9WKDzE_jgLKyGmpDtFkc2g0XA.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-lWNSUXHdlLoOj-YW-IIPsHf4PkUZpg2mOxfY3ydcfqQ.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-dTzRtBAglDr5bmmC1lfBx_7qEOZO2HbfhSfW2FWyNGQ.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-Tvq2mvlQmuONlBVoIX6-DCE0n_mI7nxNqkTWgdcihmM.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-Jf7LnynZysxjCN7YESDXkZn1gP8PibvqjBL1FBhvts4.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-fC2r1Ty3CguLJUsewQqLU_2TYQc0pb2siWr5NGejsoU.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-lXHEcCWaLp4UuXT7hYmz5EjPmGJban9W5-Nu5x3-5EE.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-437ES9eiMhJ1Y11HB8ayclU_ydbYoE3oMJXedDJOXGU.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-TzwIgbgJBQHEB_5iO8jbJIHfRElu2MQAmz9b8bT4u-8.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-8Lxzs8BC7gTWlXe1o2Etg2ei1PRsE3apPA-L2JKKM7g.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-p94N-n9c_EjNYNLwqLMmao5Bcv2uIofychfd825rhKI.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-kAvCa4mMM44u-kFTdC0A_GsET1Vi8-AYt2Phth5fkdk.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-z5JvRgCfF5_0VLObQPN5hduxU_c-hldCVoT_qKB0mPY.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-jbW6E_96mGadYvA5ObReDoAncByRtQLCRm7fPrsmHPE.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test138384536709586641.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-LKWJtl27MUTBW5_lBsFDR_3pJklE3tA-8SqLss40HoM.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-cTJEIZd-WwkIIP9eIk-E1Y6zjz3xNvIECNLZWjmZkrg.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-AyBWJFwi-tqM7Kax0AMkCpfDNTCAmvs-5g0y5JDwRVM.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-ARx14qfRGJpc0KqQje2qLc2u7GUQUx4njSen3RdtJvw.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-diFpt8zLPx1MfAvxVCfPX8Dp47zbOZH79MHuEaON6-o.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-f7ajKoVSG3ZHWuba_zJUAgvJgjxKFZe_14RUUG7iN8k.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-e1bHM_j1d7VFTvYa2tO7Z2zKNixVy7eV48v6-w8bi7E.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-TmAx8pS24gMd_2GcSeiziCKltEYtb0eidu1MwqN5PZU.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-WIY-f7Vam7zmqScjAnhD9njvZMNWLWxVYrPH4QmI_JA.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-cXXnfev2Hy82bOVt7KFoH9GGRdax6DtDEQTaE_Xw96k.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-VSkJrFGnXWn5gWzDdCTxdTCZaYPRrWjw-o4eThu6e9M.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-y_O4qWC_sdTe7zVevjl0NX_5VgTYS33P4zLb4GeWQ7s.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-QISqasz_ZfN_FeHMcfMFW49dE-r7lydUjSnLbpCBlmU.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-jd--vyOsLJvUdcBKr9pE7wclhnkGHhomFlpOkLkP7Rw.jar
    Oct 20, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-RJwekWd18H707fS_vAh76iOu8thJg8mt0m0doclkBro.jar
    Oct 20, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 20, 2020 6:45:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 20, 2020 6:45:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 20, 2020 6:45:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 20, 2020 6:45:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 20, 2020 6:45:57 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 20, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96014 bytes, hash 3b34bb572c9907a952c8f8b7fe461fc03da3d5d7ac35ce50c07229d1fc5d378c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-OzS7VyyZB6lSyPi3_kYfwD2j1desNc5QwHIp0fxdN4w.pb
    Oct 20, 2020 6:45:57 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 20, 2020 6:45:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-20_11_45_57-4849864584136633057?project=apache-beam-testing
    Oct 20, 2020 6:45:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-20_11_45_57-4849864584136633057
    Oct 20, 2020 6:45:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-20_11_45_57-4849864584136633057
    Oct 20, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-20T18:45:57.910Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 20, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T18:46:05.459Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 20, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T18:46:06.119Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 20, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T18:46:06.163Z: Expanding GroupByKey operations into optimizable parts.
    Oct 20, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T18:46:06.220Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 20, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T18:46:06.316Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 20, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T18:46:06.360Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 20, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T18:46:06.397Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 20, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T18:46:06.446Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 20, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T18:46:06.892Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 20, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T18:46:06.985Z: Starting 5 workers in us-central1-c...
    Oct 20, 2020 6:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T18:46:28.242Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 20, 2020 6:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T18:46:36.425Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 20, 2020 6:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T18:46:48.046Z: Workers have started successfully.
    Oct 20, 2020 6:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T18:46:48.083Z: Workers have started successfully.
    Oct 20, 2020 6:47:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T18:47:24.348Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 20, 2020 6:47:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T18:47:24.636Z: Cleaning up.
    Oct 20, 2020 6:47:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T18:47:24.761Z: Stopping worker pool...
    Oct 20, 2020 6:48:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T18:48:22.232Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 20, 2020 6:48:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T18:48:22.296Z: Worker pool stopped.
    Oct 20, 2020 6:48:27 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-20_11_45_57-4849864584136633057 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 042c9c33-1be8-4ae4-ad66-ec86999ff00a and timestamp: 2020-10-20T18:48:27.882000000Z:
                     Metric:                    Value:
                   read_time                    14.378
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 20, 2020 6:48:28 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 45.996 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 10s
107 actionable tasks: 69 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/yzr5awzfgoy24

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1138

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1138/display/redirect>

Changes:


------------------------------------------
[...truncated 272.11 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 20, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 20, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 20, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 20, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 20, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 20, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 20, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 20, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 20, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 20, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 20, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 20, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 20, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 20, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 20, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 20, 2020 12:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-kSs_ZinjeNiADetG6MYuv997nzN05OGbyo5mbjUqne8.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-2By4mENItNhQgXu2wDxVIFLlDc5LtBOGNdAsKfMNgaw.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5593900902197656553.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-NlBvvbuwegIUYkLuLaXbG5STTAu0_qgWErrU3I3L2Bc.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-GHw8nb8DHzwIEcN3PuO3yBzVSkv0ACUkJAt5EvXiFqM.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-C-TtgRrn00bAyEun-urr8sqCgRUY8VVErA7jVH9LSHQ.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-_gCOOHAZXdUUF717KVAL4Q6XNrAl7fApBmVXtDF-xiw.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-AKTOIOu9OBsbimjWjwMNabw8kewouTI0lpuHMUrbMNg.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-4lpp5TKpUocuIfkSOK0NE6e9m7VPD6swWC1qfHWRd3Q.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-kA0I7eJ1gW-P0gVD8Jwqutb8lCRh76dbOjyjzXgsZ1I.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-vAwgH4gEJNwOMeoILrfEbJqZXUry3s9DMsIZN4uRZyg.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-5mQIE6dAPtV6t14EowD3ZYeabD-tvnQMuiggvxFS9NY.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-9AEYRkjSSihIBRCuaNuUp9DbcKjLFqaWcQYymUkwxh0.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-3dFPRuTkYrEiHOIXHLAAIVEy8OczcOwFUJPt9cf1yJQ.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-BUyL0cd_aadvahTDpoJyugFVd1w8lHM7ILwONSnMtWc.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT--5m5cAzVCwXbJL7vwE7pKuMRHfHBktlNFqMvBDTZXCw.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-JbXZGWHivZoHUN9dqmd1jEzXQ7_evbURimqMfHRqsvg.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-7ozHCJ1MH_SqhuZoGuXvBJqHqdRkAUzMnds_HzhDcmM.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-JHY3m0lKPDRJPBi5ZSXGlAYPgpx6CQ4Fl4XTpmJK1l0.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-Z1q8ZVbYXez2Vinzyo4XITXj6rVsYKXDl1T0jn1wLKE.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-UnE_YSzjl3EUp3Face2Xun5Xmox2DkCO3CLDr3ldRFA.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-kSs_ZinjeNiADetG6MYuv997nzN05OGbyo5mbjUqne8.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-B0swmAmLGb-krkqysTS60gGKhEqJSezJbDtb4lV2uts.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-9laKXLPC4YFTTPODuzSWanXDkpttvuafsh0j5MdVyL8.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-TiDS3JH_OSPpSq55M0QIh2uZXmrBFQKOAnJFViRni2I.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-aQ1Iem1aKJhokYauCbe5JZZBqkVNMXMivBh46PmGbow.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-yiviFeFiXBJ_Q314Xn1Ch7HfNUExhjhIxExIn6Xrxtc.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-P6Fdn9aeEX263WXRBCWMRE2tFNGW5meP6VpgGy5k_KQ.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-FwBsJj_fNyA9jEvg3SLPOvrUXsrAiSES0HOgBqTAxmo.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-Uzh7sptEaK3V1HLv1c65tEvZuqdOb1WO0BjRuRBKOD8.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-9N4db0Ul5mQ-9JEJvc3_DMTohp_AfAVuraMTHa1QwFs.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-b8raQ6Ru5RiKHewPA4-QS4KYoGbVuTY6r-Pr6BjwrFA.jar
    Oct 20, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 0 seconds
    Oct 20, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 20, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 20, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 20, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 20, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 20, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96015 bytes, hash d2a4fcf3050d115a6deb97c6f2974cc2191967326b14e7cdd22ad9886c9c2fc5> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-0qT88wUNEVpt65fG8pdMwhkZZzJrFOfN0irZiGycL8U.pb
    Oct 20, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 20, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-20_05_45_15-16955164064953839990?project=apache-beam-testing
    Oct 20, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-20_05_45_15-16955164064953839990
    Oct 20, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-20_05_45_15-16955164064953839990
    Oct 20, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-20T12:45:15.643Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 20, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T12:45:22.897Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 20, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T12:45:23.520Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 20, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T12:45:23.548Z: Expanding GroupByKey operations into optimizable parts.
    Oct 20, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T12:45:23.585Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 20, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T12:45:23.672Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 20, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T12:45:23.698Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 20, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T12:45:23.734Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 20, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T12:45:23.782Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 20, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T12:45:24.305Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 20, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T12:45:24.371Z: Starting 5 workers in us-central1-c...
    Oct 20, 2020 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T12:45:52.458Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 20, 2020 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T12:45:54.518Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 20, 2020 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T12:46:13.474Z: Workers have started successfully.
    Oct 20, 2020 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T12:46:13.511Z: Workers have started successfully.
    Oct 20, 2020 12:46:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T12:46:50.297Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 20, 2020 12:46:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T12:46:50.479Z: Cleaning up.
    Oct 20, 2020 12:46:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T12:46:50.573Z: Stopping worker pool...
    Oct 20, 2020 12:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T12:47:42.365Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 20, 2020 12:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T12:47:42.404Z: Worker pool stopped.
    Oct 20, 2020 12:47:48 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-20_05_45_15-16955164064953839990 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 07f206ba-b8f3-4482-9d42-6add0ae64b95 and timestamp: 2020-10-20T12:47:48.922000000Z:
                     Metric:                    Value:
                   read_time                    16.677
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 20, 2020 12:47:49 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 46.532 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 33s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/5jl3eawic6nqu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1137

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1137/display/redirect?page=changes>

Changes:

[noreply] [BEAM-11074] Release guide housekeeping. (#13147)


------------------------------------------
[...truncated 272.08 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 20, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 20, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 20, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 20, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 20, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 20, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 20, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
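
    [Editor's note, not part of the build log] The exception above itself names the standard remedy: give the Row PCollection a schema via PCollection.setRowSchema. The following is a minimal, self-contained sketch of that fix, illustrative only; the field names mirror the query's author/type/title/score columns, but the exact types and values are assumptions.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Schema assumed to mirror the SELECTed columns; types are illustrative.
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

        PCollection<Row> rows =
            p.apply(Create.of(
                Row.withSchema(schema)
                    .addValues("someone", "story", "Example title", 3L)
                    .build()))
             // Without this call, pipeline construction fails with the same
             // "Cannot provide a coder for a Beam Row" IllegalStateException.
             .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }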

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 20, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 20, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 20, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 20, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 20, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 20, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 20, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 20, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
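    [Editor's note, not part of the build log] For reference, a rough sketch of what this projection and filter push-down amounts to when written directly against BigQueryIO with the Storage Read API; the table spec below is an assumption for illustration, not the table used by this test.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import java.util.Arrays;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply(
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full")  // hypothetical table spec
                .withMethod(Method.DIRECT_READ)
                // Only the referenced columns are requested from the Storage Read API...
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // ...and the supported predicate is pushed down as a row restriction.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }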
    Oct 20, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-1pORLYs6yOHuKoSlUYkq9cN9ynlyhctHPbp83CiUYzU.jar
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tMlckR3sC686FgXLxLfRH6iqKC7y_MWaY9U8lRUEREk.jar
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-NkACEwSwjZnmdKK5R_oLPsD90_VAqlgo9J2fA9oz1xA.jar
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-ycSPLwUJZglpkypoL_ZVxcMKvvZ3UZZSKLV8PVBU5b0.jar
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-3UrosiMHCYbBzcH8sOxM3q_3nnJTWYPxkMeZKrw3lJ0.jar
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-QQVKQzKyH00WNPNaVs1SvGLZQ0FQsKzxjl7IE7GGJTg.jar
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-vgYedhnKUg_fbfW-C5kIj85SmPvS0vBHAhzy7rKL5Ms.jar
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-p3GFZitEWT4RxpgrwOBe-PZ5FkO6ANg530pKL3BPFso.jar
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test888352362060411605.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-x-VSjtqgtOVqsOtq-Zfgl-oEYPYbjNULwDsGhTDoeL0.jar
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tkHoGyA2hNF8PoUoffbNVVVdEU5taYXVFypknYr71q0.jar
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-BW4Nxg_M9gQz05OBZMdxAdrSYjWCLI2hpEcMwfAoqxk.jar
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-4oYedcNuRe6tsHsPF9YmY97mWA9G51Edv4vBYDWdbyA.jar
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-B5Z_7HJTNaN7-iilIsGXn-E-bhYPanVLs3X0Lf4Q00g.jar
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-eUbmJbPD5ZK1mVlZVE7KfxEXqSPDTyfY1opqRO1Sg9Q.jar
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-IMclV1UjvBX61sxdO0RD1DAh9cs43Ci8eI424M5gMeo.jar
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-AB-xsk2U34Leu-1FEPRY7bKhLddCnbR6H4tfWcGTaD8.jar
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-csuN9eLN4s0W5X3XQ3TbepUuCAdS-ULg3wkDhi8OQvI.jar
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-9tlmnnAhAbFXPJnqfQkSkfOOMljtyXHaA8tEjeOKENU.jar
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-nyktlbCZJxeUMxVbk3Nt2SlBAfCVNjISDlWnIUrO0Zg.jar
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-NljAHJALk2ijZ2F_xIoVNmG8vvWhNqc8TSYxvW10zHM.jar
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-LhvAWoeRy9f8LS-6s6ogCRcPpv8zMGqIOqs7ndibqLM.jar
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-Wy-841gpYaQUvjEsPMfz9ltj-EPslN86F4RkM2RtVhU.jar
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-JPuadfVmRobqiwuQNSNIQXs8hV3QhLIS4oMDTGqtMgs.jar
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-KiBoYRwTlQqG21kHW5BlZMvRqEYw0F5YUCKY_qbV4zg.jar
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-MjCOmCiYbk-sNAiRORJexnF266htocMI8JUlHlDyNi8.jar
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-QDK26z4FtbMt_SkxTootkuP_RICV1pr3Eq1bXub1NyM.jar
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-wATUT3TYP8tH5nlYRd8LlTPtVLOCoUNK4FYU9HJna3U.jar
    Oct 20, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-1pORLYs6yOHuKoSlUYkq9cN9ynlyhctHPbp83CiUYzU.jar
    Oct 20, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-dXnyPJ9VO0FVvbLxt5swaYgwhUmposj1Yp_P9Baz4Fs.jar
    Oct 20, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-E9yQR7QNiVK2ABi4K98hHlSz_KMPzwy8biBMZOgUuZc.jar
    Oct 20, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-0X1QEKFMLkQX6BDCbtJEv9RCLjondFfveQW5motpm4M.jar
    Oct 20, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 20, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 20, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96014 bytes, hash 0f13ed5a2b81c2531d48c0d408ee0242e7a67400a5e02df9fba93a2940eb1db8> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-DxPtWiuBwlMdSMDUCO4CQuemdACl4C35-6k6KUDrHbg.pb
    Oct 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 20, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-19_23_45_21-12778669957321367505?project=apache-beam-testing
    Oct 20, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-19_23_45_21-12778669957321367505
    Oct 20, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-19_23_45_21-12778669957321367505
    Oct 20, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-20T06:45:21.418Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 20, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T06:45:35.989Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 20, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T06:45:36.672Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 20, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T06:45:36.710Z: Expanding GroupByKey operations into optimizable parts.
    Oct 20, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T06:45:36.755Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 20, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T06:45:36.987Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 20, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T06:45:37.014Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 20, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T06:45:37.089Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 20, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T06:45:37.124Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 20, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T06:45:37.563Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 20, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T06:45:37.645Z: Starting 5 workers in us-central1-c...
    Oct 20, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T06:45:50.331Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 20, 2020 6:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T06:46:03.421Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 20, 2020 6:46:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T06:46:22.882Z: Workers have started successfully.
    Oct 20, 2020 6:46:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T06:46:22.919Z: Workers have started successfully.
    Oct 20, 2020 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T06:46:56.282Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 20, 2020 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T06:46:56.485Z: Cleaning up.
    Oct 20, 2020 6:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T06:46:56.569Z: Stopping worker pool...
    Oct 20, 2020 6:47:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T06:47:39.484Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 20, 2020 6:47:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T06:47:39.532Z: Worker pool stopped.
    Oct 20, 2020 6:47:44 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-19_23_45_21-12778669957321367505 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 604a5bdf-6adc-4f07-aa09-dea89512e977 and timestamp: 2020-10-20T06:47:44.653000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.265

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 20, 2020 6:47:45 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 41.211 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 28s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/4tc7en4xqszdu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1136

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1136/display/redirect?page=changes>

Changes:

[Robin Qiu] Ensure beam-sdks-java-bom.pom is signed during release

[noreply] Update go protocol buffers to v2. (#13115)


------------------------------------------
[...truncated 287.06 KB...]
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 20, 2020 12:45:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 20, 2020 12:45:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 20, 2020 12:45:55 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 20, 2020 12:45:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 20, 2020 12:45:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 20, 2020 12:45:55 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 20, 2020 12:45:55 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 20, 2020 12:45:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 20, 2020 12:45:56 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-sXSTx4-Lw1lidoRJAHFrTYH4odRS7VKi22F4sYarFS4.jar
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-BnT55AjjrUKZyVBO7gcfmnYRL-2wIKFWp17Gm4uI1LY.jar
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-5wogwEKR06CMUV5Iir2eUo8tGP04e87NBAzUVKgxvyg.jar
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-iNhWY-lNFJGDQfkN-xbtlx0on-VIN89PCshDLvxXPCc.jar
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8447441632538950627.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-QZF0-FNH8rGNJqsEDr-a3uQoPTlWva5dOXXH6x-_c1k.jar
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-E314Z-9aWBlxQ1SoyrcGUU3IihehrYtU3erbxvTv9xo.jar
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-pc5KznCDBPY7nELEDAIUWVqU9LrfIBzUjTECwH0aNrY.jar
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-Sx86pCPtnQsodr_Don0ovFEm5Zy6MEKf5f7PJjJh3Mw.jar
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-pU3Pkdi6EgKXy4UIYbctHkEIuxkyEInBScldYMn2W08.jar
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-P1rGZt2voOWVem4Ffb0ZTbY6N_sfQ1JOTBpXqb72Y-w.jar
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT--51d7mA6o8VrZtYRT7btahq1lX3_3h16Z7oNw0ZyYvM.jar
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-siFcvEdjl0FwEUYaH7nizPwYVSj97fugIYNojOJREg0.jar
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-xIUfa0scAWjjfVNN6xi4fkSlpQ_kggO2DP-lcpt_fcM.jar
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-P_6D9P5-L_n9HVU3JifAE_RQB3VX0Up7kIrrYH5oe90.jar
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-c90ukbYgKNYX9Ob9LzqSNmEaOgWiGRJEtUJgkJO5VDs.jar
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-aFSHu7I0A7QLaW8efye-fMl2q6Ioi9vQR05zslKrkxs.jar
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-P9ESPupegQtPeB2dUQzY1AQa1wAuxlYCTde9W6olKM0.jar
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-sXSTx4-Lw1lidoRJAHFrTYH4odRS7VKi22F4sYarFS4.jar
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-OZbxqgKmHNDwhTcsTTFIj49ug2pH65man8SvYEgHC3Q.jar
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-iJts8Wl8ZKXQfh7d4ip1yV9l6dnAssZXpJLBKc88lLk.jar
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-kRENAY_CBVRpBBRVcQxXIpKQAKvWsBHavGHWjDQx5Vo.jar
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-8UMuQPha70fFHZENvoKQUeN1kQhpQcHH2zt3KWoFyFA.jar
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-t5-UAStR-xA70A_5J6suAtdPIHY1QnMzZLJQU3-7A0w.jar
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-tmci0HCfIwpPzrBY-Q4a_2jYKMTR0EgvG4oadFkBYcU.jar
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-4lLzn_al5sn9BwZ56VrRhKWEeY9HadwXyiHVJiFYh90.jar
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-fwcqFfrtsRDLSHcOdBQK9AqsR7Ztc4obQMh4MsuZPik.jar
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-6eui5zZYy5C8TOgqM9kJH_g-AefUhTv895oSG-R6qAs.jar
    Oct 20, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-GYpIxHxetbSiHpfFpmgmCJAoUdHEQKjU7lio9kuDsNU.jar
    Oct 20, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-kDdqTc8ckV2hh0wBEGvi35TJaotBAPiy9YHA9JO49WY.jar
    Oct 20, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-nUi83JJ0OpzeVsCMlz22voIXuSQVI5tttAlsCiZL24U.jar
    Oct 20, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/de.flapdoodle.embed/de.flapdoodle.embed.mongo/2.2.0/781d14f4e3d9eeb0b4c3e00a4ec165a04b3dc5c4/de.flapdoodle.embed.mongo-2.2.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/de.flapdoodle.embed.mongo-2.2.0-vNy3lJC0jW9u4Cy1AHsqSbjRUqOTX9ycpEmHkht7vvk.jar
    Oct 20, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/de.flapdoodle.embed/de.flapdoodle.embed.process/2.1.2/986b38302fa10018d5baccee7f655c31ee9afe5b/de.flapdoodle.embed.process-2.1.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/de.flapdoodle.embed.process-2.1.2-OasY7D5KRAimcZcWcjFwgi8Qb4B-iff1FfrVeNSih6A.jar
    Oct 20, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-csv/1.8/37ca9a9aa2d4be2599e55506a6d3170dd7a3df4/commons-csv-1.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/commons-csv-1.8-qL1WZS7UZo2dWjOZSuUvWbnjnI6w68tmhOaK7udXmmE.jar
    Oct 20, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.beam/beam-vendor-calcite-1_20_0/0.1/6d16a59dc771784789116607a04acd9a0ef91d16/beam-vendor-calcite-1_20_0-0.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-calcite-1_20_0-0.1-1NrX_9FNKiEqNk5qBOaRlj-IwqOvKvQIGIbTVgm_v8Y.jar
    Oct 20, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.codehaus.janino/janino/3.0.11/e699e368095379ba0402ea1780a87fcaea16e68f/janino-3.0.11.jar to gs://temp-storage-for-perf-tests/loadtests/staging/janino-3.0.11-kje3HSMpGA5ZIQ6aqhAO4xNFTvCuWIYIx1yxkxlZG-E.jar
    Oct 20, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-GEXtmyVuhoNyYMNmEyaC-XHfdJe-5Szke2plahP4KnM.jar
    Oct 20, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.codehaus.janino/commons-compiler/3.0.11/f2a6ec7fbc929c9fc87ff8bb486c0574951c5b09/commons-compiler-3.0.11.jar to gs://temp-storage-for-perf-tests/loadtests/staging/commons-compiler-3.0.11-DxpPXyZccBoxkzJErnBF_O8YtPpZUEF-Je5wvlDd2s8.jar
    Oct 20, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/antlr4-runtime/4.7/30b13b7efc55b7feea667691509cf59902375001/antlr4-runtime-4.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/antlr4-runtime-4.7-KmGUP4A7vR0OAt_9GbkqQY-DNAyZQ0aAnjtR4iMapsA.jar
    Oct 20, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/ST4/4.0.8/a1c55e974f8a94d78e2348fa6ff63f4fa1fae64/ST4-4.0.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/ST4-4.0.8-WMqrxAyfdLC1mT_YaOD2SlDAdZCU5qJRqq-tmO38ejs.jar
    Oct 20, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.glassfish/javax.json/1.0.4/3178f73569fd7a1e5ffc464e680f7a8cc784b85a/javax.json-1.0.4.jar to gs://temp-storage-for-perf-tests/loadtests/staging/javax.json-1.0.4-Dh3sQKHt6WWUElHtqWiu7gUsxPUDeLwxbMSOgVm9vrQ.jar
    Oct 20, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/antlr-runtime/3.5.2/cd9cd41361c155f3af0f653009dcecb08d8b4afd/antlr-runtime-3.5.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/antlr-runtime-3.5.2-zj_I7LEPOemjzdy7LONQ0nLZzT0LHhjm_nPDuTichzQ.jar
    Oct 20, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.abego.treelayout/org.abego.treelayout.core/1.0.3/457216e8e6578099ae63667bb1e4439235892028/org.abego.treelayout.core-1.0.3.jar to gs://temp-storage-for-perf-tests/loadtests/staging/org.abego.treelayout.core-1.0.3--l4xOVw5wufUasoPgfcgYJMWB7L6Qb02A46yy2-5MyY.jar
    Oct 20, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/antlr4/4.7/cd6df46532bccabd8127c18c9ca5ef481962e931/antlr4-4.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/antlr4-4.7-eGclcCizNzrwEd7nts6bWHqP1cegsl9osv9MuQvoqgc.jar
    Oct 20, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.ibm.icu/icu4j/58.2/db9fd4b4c189cf1518db14c67d14a2cfcfbe59f6/icu4j-58.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/icu4j-58.2-lT4eg7K-fD6i-I2obBNhT0fp5x01eMhSHX8Yd1a2OWI.jar
    Oct 20, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 176 files cached, 43 files newly uploaded in 1 seconds
    Oct 20, 2020 12:46:01 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 20, 2020 12:46:01 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 20, 2020 12:46:01 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 20, 2020 12:46:01 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 20, 2020 12:46:01 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 20, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96015 bytes, hash abd4fb2ce25b1cdd6017dfb0af1b0cf2b6765164d1f31faeae1fabf4b447f794> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-q9T7LOJbHN1gF9-wrxsM8rZ2UWTR8x-urh-r9LRH95Q.pb
    Oct 20, 2020 12:46:01 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 20, 2020 12:46:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-19_17_46_01-10654223772275138856?project=apache-beam-testing
    Oct 20, 2020 12:46:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-19_17_46_01-10654223772275138856
    Oct 20, 2020 12:46:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-19_17_46_01-10654223772275138856
    Oct 20, 2020 12:46:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-20T00:46:01.915Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 20, 2020 12:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T00:46:13.005Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 20, 2020 12:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T00:46:14.532Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 20, 2020 12:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T00:46:14.584Z: Expanding GroupByKey operations into optimizable parts.
    Oct 20, 2020 12:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T00:46:14.646Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 20, 2020 12:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T00:46:14.835Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 20, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T00:46:14.875Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 20, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T00:46:14.900Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 20, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T00:46:14.943Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 20, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T00:46:15.378Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 20, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T00:46:15.493Z: Starting 5 workers in us-central1-c...
    Oct 20, 2020 12:46:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T00:46:38.011Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 20, 2020 12:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T00:46:45.490Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 20, 2020 12:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T00:46:57.939Z: Workers have started successfully.
    Oct 20, 2020 12:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T00:46:57.979Z: Workers have started successfully.
    Oct 20, 2020 12:47:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T00:47:30.073Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 20, 2020 12:47:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T00:47:30.275Z: Cleaning up.
    Oct 20, 2020 12:47:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T00:47:30.377Z: Stopping worker pool...
    Oct 20, 2020 12:48:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T00:48:21.934Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 20, 2020 12:48:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-20T00:48:21.982Z: Worker pool stopped.
    Oct 20, 2020 12:48:27 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-19_17_46_01-10654223772275138856 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ecc7dabe-01de-4db3-8b33-cf9c595c3957 and timestamp: 2020-10-20T00:48:27.878000000Z:
                     Metric:                    Value:
                   read_time                    11.845
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 20, 2020 12:48:28 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
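
    (Note on the warning just above, not part of this build's configuration: InfluxDBPublisher
    skips publishing when it is not given a measurement and a database. If publishing these
    metrics is wanted, the values are normally passed through the integration-test pipeline
    options; the option names below are my assumption about the usual spelling and are not
    confirmed from this job's setup.)

        --influxDatabase=<database>        (assumed option name)
        --influxMeasurement=<measurement>  (assumed option name)
        --influxHost=http://<host>:8086    (assumed option name)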

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 2 mins 42.844 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 11s
107 actionable tasks: 78 executed, 29 from cache

Publishing build scan...
https://gradle.com/s/vtbdqgyru7fbe

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found

Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1135

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1135/display/redirect?page=changes>

Changes:

[tobiasz.kedzierski] Remove Apache license from metrics mail

[ajamato] Rename ProcessBundleProgressMetadataRequest to MonitoringInfosRequest.


------------------------------------------
[...truncated 290.78 KB...]
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 19, 2020 6:46:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 19, 2020 6:46:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 19, 2020 6:46:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 19, 2020 6:46:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 19, 2020 6:46:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 19, 2020 6:46:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 19, 2020 6:46:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
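
    For anyone triaging the two coder failures in this run: the exception text itself names the
    fix. The output of ParDo(RowMonitor) is a PCollection of Beam Rows, and a Row cannot be
    encoded until a schema is attached, so the producing step has to call
    PCollection.setRowSchema. The sketch below shows that pattern in isolation; the schema and
    the names (input, RowMonitorFn) are placeholders for illustration, not the actual
    BigQueryIOPushDownIT code.

        import org.apache.beam.sdk.schemas.Schema;
        import org.apache.beam.sdk.transforms.ParDo;
        import org.apache.beam.sdk.values.PCollection;
        import org.apache.beam.sdk.values.Row;

        // Schema for the four columns the query above projects.
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

        // Attaching the schema gives the output a SchemaCoder, so the
        // "Unable to return a default Coder" check no longer fires.
        PCollection<Row> monitored = input
            .apply("RowMonitor", ParDo.of(new RowMonitorFn()))  // placeholder DoFn<Row, Row>
            .setRowSchema(schema);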

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 19, 2020 6:46:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 19, 2020 6:46:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 19, 2020 6:46:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 19, 2020 6:46:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 19, 2020 6:46:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 19, 2020 6:46:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 19, 2020 6:46:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 19, 2020 6:46:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
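
    (Side note, not from this build's output: the plan above pushes the projection and the filter
    into the BigQuery Storage read itself. Expressed directly against BigQueryIO with DIRECT_READ,
    the same read looks roughly like the sketch below; the table reference is a placeholder, not
    the table the test actually reads.)

        import java.util.Arrays;
        import com.google.api.services.bigquery.model.TableRow;
        import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
        import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;

        TypedRead<TableRow> read =
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full")  // placeholder table
                .withMethod(TypedRead.Method.DIRECT_READ)
                // read only the fields the query projects ...
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // ... and apply the filter the planner reported as pushed down
                .withRowRestriction("(`type` = 'story' OR `type` = 'job') AND `score` > 2");
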
    Oct 19, 2020 6:46:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-XmT6avB9Emzf-AwiMySb2k0rMSQs0vzb2UZtDusDxYw.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-anl1vQUICh8-fadYXW9eSNDZnjPekRvit6vrYPu9LkM.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-trLsdTIEGm3FfLmk91NKZIv5z9j1c0J2_ftJWFPqAFs.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-nndw8BYJEVEkirYQHsm4wiWbvS_A_-me3dYKFqhH4Gg.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-0ysIiK0S818Lb-fwn3WtLzDdr27By7PW9OYRRu2CrWE.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-ltO6z_xhJO8Onf3L_ujEPt2gnp31wzZBgSK59HGf0FA.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-XmT6avB9Emzf-AwiMySb2k0rMSQs0vzb2UZtDusDxYw.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-kwvW8X01EP2MG-2A77_FV2EEZwnlQH8DGr5oqGYrvlI.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-Ru7U73WfEZewlUiq5HIMMpr4LzRqA79osN821MCm2Cc.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-NapYo6XwACLzr64yC1Aej4AeKtszTxoVIemdYQAytzQ.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT--JrIwG0igSKSvieNwokkXGeVxGQinp3hgMv-Vjd7zTA.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-T63-nHzTebW8xjGZSf0cnVcpsG7cplPLA3iYlTxl8lY.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-9xON_zke6hI7AeoCfaDxzuoas63HQvCCHmUQn8ghS50.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-7L-xVK0FqFDl9ysjiVU51VY9SXIO-hEb1LAGuYFwkYE.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-RJD6dS6tg9BGLxgNtW9vrXGrM8MeNuG9IQ1tLpOrYdc.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2663044678029323320.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-BFMv_rqarkOjAWmVkMANIrOURkc46Ma2qYYrOCIUYV8.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-o19C3Z-ZOR2fYodJiv889aHSnBmMXX3tXZcMuYrvduc.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-tAEgiFGQ6OMT3X8mAXsPHvicWjgpCNWX7EN-i5bN3f8.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests--FFCp81xIwKoaaVzlDUMk0sGUnM14s-ZaWbILSXOmrM.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-Fh6J19EYxk-BzvVAsB3jLzQIwwqsbRxgqkSbRDKbZEo.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-qs7Ur-laAUdo2zX-mJ90inx7dY-0N_8C7rZoOWC5Im4.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-JqyFpzGEgR5suFsllUe425IqtKrb89eGzVcSMy3pL7I.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-3zBM9kJ0Zb0b_9su10bHPJpxhQK6CszJaOTUJAyByqY.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-2rBFnhw3fmKwWsAKu2-4Uy52AsxeNQG7Z0MYuvYBlNw.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-c2jiPT-pTnm-Sg95-ClXUNya8KXuSvQEX9lVetZQt3c.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-C7hJd1OGOtzb38TLX9_tTKd23RZ0lMwzF-nUX10bzv0.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-65OEVvhL20BWF2jAht_0-fAXP5EmsqrigRkXcGI9Tmw.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-hRlUnicDOQcIIQe84YK_Qk0QsKKZHjNhu-RXtwDwYyk.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-Rpu7PPOrXJ2ABASBDnPy5igSjczMIF0DYBNrXxCnFzg.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-VdvFyuHUX7q7hEuxZH0eZg6AcmNpxJTqX696PDr6Dak.jar
    Oct 19, 2020 6:46:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-wAUMFniGPuojjWf9Q6RH30tVZ5Xrl-XZgsxYRRBAKgw.jar
    Oct 19, 2020 6:46:35 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 19, 2020 6:46:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 19, 2020 6:46:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 19, 2020 6:46:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 19, 2020 6:46:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 19, 2020 6:46:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 19, 2020 6:46:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96021 bytes, hash d3dfa87c3af3694bb147f402ea110e509353dcb458d178a610f98371c5af6e20> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-09-ofDrzaUuxR_QC6hEOUJNT3LRY0XimEPmDccWvbiA.pb
    Oct 19, 2020 6:46:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 19, 2020 6:46:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-19_11_46_36-7248686492500622736?project=apache-beam-testing
    Oct 19, 2020 6:46:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-19_11_46_36-7248686492500622736
    Oct 19, 2020 6:46:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-19_11_46_36-7248686492500622736
    Oct 19, 2020 6:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-19T18:46:36.178Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 19, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T18:46:44.946Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 19, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T18:46:45.637Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 19, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T18:46:45.686Z: Expanding GroupByKey operations into optimizable parts.
    Oct 19, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T18:46:45.730Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 19, 2020 6:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T18:46:45.886Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 19, 2020 6:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T18:46:45.919Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 19, 2020 6:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T18:46:45.953Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 19, 2020 6:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T18:46:45.983Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 19, 2020 6:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T18:46:46.336Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 19, 2020 6:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T18:46:46.418Z: Starting 5 workers in us-central1-c...
    Oct 19, 2020 6:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T18:46:56.531Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 19, 2020 6:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T18:47:09.801Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 19, 2020 6:47:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T18:47:30.930Z: Workers have started successfully.
    Oct 19, 2020 6:47:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T18:47:30.966Z: Workers have started successfully.
    Oct 19, 2020 6:48:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T18:48:10.276Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 19, 2020 6:48:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T18:48:10.440Z: Cleaning up.
    Oct 19, 2020 6:48:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T18:48:10.510Z: Stopping worker pool...
    Oct 19, 2020 6:49:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T18:49:02.695Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 19, 2020 6:49:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T18:49:02.745Z: Worker pool stopped.
    Oct 19, 2020 6:49:09 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-19_11_46_36-7248686492500622736 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d9a65cc6-b7d9-488c-87e2-cc1c4ac99eff and timestamp: 2020-10-19T18:49:09.301000000Z:
                     Metric:                    Value:
                   read_time                    17.321
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 19, 2020 6:49:09 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 4 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 46.87 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 52s
107 actionable tasks: 85 executed, 22 from cache

Publishing build scan...
https://gradle.com/s/rtlmhjxbigtoq

Stopped 3 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found

Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1134

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1134/display/redirect?page=changes>

Changes:

[pawel.pasterz] [BEAM-5573] Update jna dependency

[pawel.pasterz] [BEAM-5573] Remove jna dependcy

[pawel.pasterz] [BEAM-8643] Update elasticmq-rest-sqs

[pawel.pasterz] [BEAM-8723] Update postgresql

[pawel.pasterz] [BEAM-10370] Update jedis

[pawel.pasterz] [BEAM-8928] Update clickhouse-jdbc to version 0.2.4

[noreply] Delete myself from mailmap


------------------------------------------
[...truncated 274.38 KB...]
    INFO: BigQuery method is set to: DEFAULT
    Oct 19, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 19, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 19, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 19, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 19, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 19, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 19, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 19, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 19, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 19, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 19, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 19, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 19, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 19, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 19, 2020 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 19, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-pqWnE-Zhv3LbBM0dxxITQCxv6x6CAMDrwoSNXgFUps8.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-4VesSm8JaGy88j5vtkJF-4YffsNz_vu0vdc5OhKIfRI.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-DLKRin_UvuxpEPM12njvUE71JYOeUGIBE6Kng20cAaQ.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-jsvk5DMAJ8HaEB7XaM0c14lQclcZ3Rcq8yDmBHmUSX4.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-pqWnE-Zhv3LbBM0dxxITQCxv6x6CAMDrwoSNXgFUps8.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-rfvy4xACly2VC5-TmI--362YY1CRR32MWvdNRr0a5yY.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-rqBTrAMsXXHsimPj3iT7HEqwz2jWTG-H2_C9NkZURV0.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-jYMm678iTq8ogg_znM8_AnY2vRhYOmyl2ZkplhuecNY.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6365747605328316752.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-vP7UN3_wEvIshppiIpJyQM0iOrfQ-hsSrF-XboQavpY.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-O6eOtLoK5FQXRktB2QkxMByPwowIqz4eVL0ruenAY_s.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-wJ-GWAn6hd94jLUOghIIUFeXo_sQaK5ilvttWWR4JBA.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-L_I7hraARYDA-Oa4dS_ZdJIdY_en7NAOi3NXKcqVI0U.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-efL4YSVcA79pYwX5ethDN3FBZN__wL2ZlJAxisuVLXY.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-3HaKtkcfq692skDGe3_q6ORwAIFM3YAmDS29bnKE6ZE.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-PtxdWDEjXksOQRJn0BcxyLoAwLIAJahEaAkw5kmj0og.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-Q9lCGTwSz6MLPjpfkSgPJg6LgcgiII-cCmcOsFWoB0E.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-pTOCgbirQsE1sSOrTFUHVQMVa9erAe8H7aYsyAIy158.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-NdNr34wOJY1PStju1-winWI7ObO1pYrG-uYHBnVPyVg.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-GG5uj8wn5XMWUnGjamuewzpqWXRG2y2Z_zo2plSo_sQ.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-lLoj33u7OcCnEENxCp6st6cj9xLyGKTyymj4PxRdTNo.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-SMMvHwNV5eliHdiRNvX2G4LzEqhPglVWh0B618RZX-A.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-BSgMBGAgXF9giztaOKFDgRBHe6S7IJG5H3NbwrjHP9U.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-ryppjC35gDXmY-M2lxKIIeZRqbhAU7zqA2Z2R86exzQ.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-6D1kZK3AVz6iyNitGgAPKjMHy8Rs8hA8sDM5iQVNNmU.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-TxEIpwr_O3woKRqsWptdVQ_oGwksiOljI_IzsFe2mTY.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-AjbRnsCgD-PzN7jV4_aR39VopUulXmaKiIojfZSNVvg.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-vFyeUn3eHdCzLBh_A_wGMlOBNQNa3wr54SSg39gZjoU.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-Ompyj5OfRKzNOjf7U6qQrAJOTdyL7q92DHfgyfPYcig.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-J6DTS06-K1Lhtyite2QrPmmndxPdZOzojXEXEAH9d0g.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-XNIJtsd6gKjMYnPJgrZQ1_e9GMa_I2BJvoKORAp9Hf4.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-LkMzZ7rACgTXbshGGEy55AKEzqPnBo5O5NIvmCtS1QI.jar
    Oct 19, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 19, 2020 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 19, 2020 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 19, 2020 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 19, 2020 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 19, 2020 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 19, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96021 bytes, hash dd76580ff3ee31ec6b5d02dd1226561bd8ca43d96646576de9c37c2a05831633> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-3XZYD_PuMexrXQLdEiZWG9jKQ9lmRldt6cN8KgWDFjM.pb
    Oct 19, 2020 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 19, 2020 12:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-19_05_45_37-16649460193199380017?project=apache-beam-testing
    Oct 19, 2020 12:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-19_05_45_37-16649460193199380017
    Oct 19, 2020 12:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-19_05_45_37-16649460193199380017
    Oct 19, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-19T12:45:37.628Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
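
This warning only reflects how the job is configured: autoscaling is disabled, so Dataflow keeps the pool at the requested fixed size rather than honoring a separate maximum. With the Dataflow runner that usually corresponds to options along the lines of the sketch below; whether the perf test sets exactly these options is an assumption.

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    class FixedPoolOptionsSketch {
      // Hypothetical options equivalent to the configuration implied by the warning:
      // a fixed pool of 5 workers with autoscaling turned off.
      static DataflowPipelineOptions fixedPoolOfFive(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        options.setNumWorkers(5);
        options.setMaxNumWorkers(5);
        options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE);
        return options;
      }
    }
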
    Oct 19, 2020 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T12:45:43.990Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 19, 2020 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T12:45:49.566Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 19, 2020 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T12:45:49.604Z: Expanding GroupByKey operations into optimizable parts.
    Oct 19, 2020 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T12:45:49.631Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 19, 2020 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T12:45:49.719Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 19, 2020 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T12:45:49.745Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 19, 2020 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T12:45:49.773Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 19, 2020 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T12:45:49.814Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 19, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T12:45:50.167Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 19, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T12:45:50.248Z: Starting 5 workers in us-central1-c...
    Oct 19, 2020 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T12:45:58.030Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 19, 2020 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T12:46:10.811Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Oct 19, 2020 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T12:46:10.839Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Oct 19, 2020 12:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T12:46:16.122Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 19, 2020 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T12:46:28.612Z: Workers have started successfully.
    Oct 19, 2020 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T12:46:28.638Z: Workers have started successfully.
    Oct 19, 2020 12:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T12:47:12.197Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 19, 2020 12:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T12:47:12.395Z: Cleaning up.
    Oct 19, 2020 12:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T12:47:12.483Z: Stopping worker pool...
    Oct 19, 2020 12:47:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T12:47:56.703Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 19, 2020 12:47:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T12:47:56.739Z: Worker pool stopped.
    Oct 19, 2020 12:48:03 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-19_05_45_37-16649460193199380017 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 8c7fd0a5-f45e-4070-8e80-c3dc33eea79d and timestamp: 2020-10-19T12:48:03.196000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    22.567

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 19, 2020 12:48:03 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
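Because the InfluxDB measurement/database settings were not supplied to this run, the read_time and fields_read values above are only printed to the console and are not persisted anywhere. In Beam's test utilities these settings normally arrive as pipeline options (for example --influxDatabase=... and --influxMeasurement=..., option names assumed here); without them InfluxDBPublisher logs this warning and skips publishing.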

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 40.037 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 46s
107 actionable tasks: 69 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/xfcmkbf3oeuoo

Build cache (/home/jenkins/.gradle/caches/build-cache-1) removing files not accessed on or after Mon Oct 12 12:44:21 UTC 2020.
Build cache (/home/jenkins/.gradle/caches/build-cache-1) cleaned up in 0.189 secs.
Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1133

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1133/display/redirect>

Changes:


------------------------------------------
[...truncated 273.60 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 19, 2020 6:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 19, 2020 6:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 19, 2020 6:45:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 19, 2020 6:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 19, 2020 6:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 19, 2020 6:45:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 19, 2020 6:45:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
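
As with readUsingDirectReadMethod above, this failure happens at pipeline construction time: the Row output of ParDo(RowMonitor) has no schema, so Beam cannot choose a coder for it. The exception text itself points at PCollection.setRowSchema; a hypothetical illustration of that call is below, with field names taken from the SELECT in this log and field types assumed. It is not a claim about where the IT should actually be patched.

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaSketch {
      // Attach an explicit row schema so Beam can infer a RowCoder for the output.
      static PCollection<Row> withHackerNewsSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addStringField("author")   // `by` AS author
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")     // type assumed; BigQuery exposes score as INT64
                .build();
        return rows.setRowSchema(schema);
      }
    }
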

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 19, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 19, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 19, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 19, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 19, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 19, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 19, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 19, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 19, 2020 6:45:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-j1qZsP6z1910NF4XNhesXEHYY8iLJ1TPJ1WuU7a18Ck.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-7IqaM_rMiZTRAbPYu5QM-cJvT8DLqXmm5rJ4cmlX58g.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-s3Kkt-7oECNL2aR6pPbXISjJABIzYc4iqnJASw4hcfo.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-jwK_0HXRPU3H6WdxiFdxUup-VKn7ZKBvehfkUNEYFZ4.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-OBgk7z0aZYzGlT3Whwycf06M4vmxU4b5663ElpaM4Lk.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-UU9YVlFTWwTcUWBdYP1NZ8wZ9rIiVN4kYE23WOIkuN4.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-tgYD3czgLlACQhj_7Mm70iu6pk5EqcfrdqqpAW92tZI.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-UFq-AGc_olhjQta8cYCu3RxMCtH88U7zIcxGI3Bd1R0.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-NNRoTpXaze137wo8L91Wb2OGUIFb76GdTWDvJ6_RfgE.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-j1qZsP6z1910NF4XNhesXEHYY8iLJ1TPJ1WuU7a18Ck.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-FpwUr6Od6k9VcubLafdr3dwSBRwZjGvya_jYF1OMFq0.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-2D99gIwvejGHhZDnfGAQNSsEfKIFvHDg9v5culxub2c.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-otGq9T3ibxRhyedufmLzaTOA-mADmrVyv-3pUU1dJPE.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-YcS0Lrt8NzlJhgpTthc9PbNvCOu1YpcnGWPRKEgj-7k.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-6Lk2CVRgjbfzAOu8bGrSQ78UyZGWqlTvPGPnZQPaWNQ.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-rydWGMKVHo6yJdDTydJMcRQTdFeMvklS0dZ8Wx_Z4kE.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2885559959438821197.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-vtkiYohlbLad6BhpEpbCBL7jbVsQ89bjTfjwxqZdNw8.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-g0ChOsUGNTQl5eF6PnjonJXnLMtT-Mak0dNROR3ja5w.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-MpWHIvSq2QEdJ4zyJ_chkWz2hhc0buh37fD2_enlfPU.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-0FCqzaKpnLnDwulHRpCkvtX6YPCEcxb2Mv_FJl_NI-Q.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-hkaXHmoJ7vBMiN0PvjwI1SbziDz1xRrUqWikKDvjfbU.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-bemvRZpzlJEwMVR2HlBROy0uzsZ9tsgcJ6RX_UTovvE.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-2qO2WgMRWpLGUg0fMpwSSZnO-_CTqerYnAysuv89kQQ.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-ma3EQ5rUrkNCJN0ASfpS_VYuE_uQNC4cZS3a0Wgq5Qk.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-Xjza0ZVf5dqsY0i2t0VgUijprI5YIZj069R44INKCCM.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-eB79JGsMdksNiDefqoZpgzgEf1ysBRvZwyS-XM3Y9L8.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-NVF5imzIIM2v1aEUowMQbcslLsjdW1ngn_003YR1Y8k.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-qlxPnQHnbsg0eOBb_9sHKnpNrMI0nWFXD4yJqBK3pJg.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-vLa1s3PbNs9oJPQDP8yqjikBi26DKNLRhqWnC8842J8.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-LX5f7N2a4Dty9HejioHS2G1MgI7vgBfvxMD0QqeZriE.jar
    Oct 19, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-kCWpQrRHPWprLD_cBwJEY-SelCxcTdEdSVFXJa1PJ8o.jar
    Oct 19, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 0 seconds
    Oct 19, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 19, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 19, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 19, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 19, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 19, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96018 bytes, hash 416294b2b74cf11522774d1c903f736e92ec3e752cc1f0090e45d95c6f7ceba0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-QWKUsrdM8RUid00ckD9zbpLsPnUswfAJDkXZXG9866A.pb
    Oct 19, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 19, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-18_23_45_14-286417872625503745?project=apache-beam-testing
    Oct 19, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-18_23_45_14-286417872625503745
    Oct 19, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-18_23_45_14-286417872625503745
    Oct 19, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-19T06:45:14.808Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 19, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T06:45:21.805Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 19, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T06:45:22.378Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 19, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T06:45:22.419Z: Expanding GroupByKey operations into optimizable parts.
    Oct 19, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T06:45:22.447Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 19, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T06:45:22.613Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 19, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T06:45:22.641Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 19, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T06:45:22.673Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 19, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T06:45:22.709Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 19, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T06:45:23.135Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 19, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T06:45:23.215Z: Starting 5 workers in us-central1-f...
    Oct 19, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T06:45:47.352Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 19, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T06:45:49.399Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 19, 2020 6:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T06:46:07.945Z: Workers have started successfully.
    Oct 19, 2020 6:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T06:46:07.978Z: Workers have started successfully.
    Oct 19, 2020 6:46:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T06:46:41.551Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 19, 2020 6:46:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T06:46:41.686Z: Cleaning up.
    Oct 19, 2020 6:46:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T06:46:41.759Z: Stopping worker pool...
    Oct 19, 2020 6:47:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T06:47:33.332Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 19, 2020 6:47:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T06:47:33.423Z: Worker pool stopped.
    Oct 19, 2020 6:47:40 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-18_23_45_14-286417872625503745 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 17b2365e-6c53-402f-b1d6-598e1f64b6c4 and timestamp: 2020-10-19T06:47:40.716000000Z:
                     Metric:                    Value:
                   read_time                    13.447
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 19, 2020 6:47:41 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 39.197 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 24s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/jp3rwjpge44pe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1132

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1132/display/redirect>

Changes:


------------------------------------------
[...truncated 271.20 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 19, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 19, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 19, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 19, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 19, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 19, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 19, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 19, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 19, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 19, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 19, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 19, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 19, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 19, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 19, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
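
For readUsingDirectReadMethodPushDown the planner keeps only the four used fields and hands the filter above to the BigQuery Storage API. The sketch below shows roughly what that push-down amounts to when written directly against BigQueryIO; the table reference is an assumption for illustration, and the test itself goes through the SQL table provider rather than calling BigQueryIO.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;

    class PushDownEquivalent {
      // Project only the columns the query uses and let the Storage API
      // evaluate the filter server-side, as the pushed-down plan does.
      static TypedRead<TableRow> pushedDownRead() {
        return BigQueryIO.readTableRows()
            .from("bigquery-public-data:hacker_news.full") // assumed table, illustration only
            .withMethod(TypedRead.Method.DIRECT_READ)
            .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
            .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2");
      }
    }
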
    Oct 19, 2020 12:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 19, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 19, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-K7qBc6QqNjR-eUShdfVo7PEfSvqjGbNVaQykBYbz9ro.jar
    Oct 19, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-ghZoouaofY0_tJxWt1eowUmqv8zxm09Xjq9tSFjedps.jar
    Oct 19, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-8vvHxL1jY6bOEqNtjNC3VH2t4FmnFsf1QUskE0UMmd4.jar
    Oct 19, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-T1Ovt4jCuV_s-SKWubVM25FcAHyCh0hsLbXMxGub0PM.jar
    Oct 19, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-vom4t5onfiUMJ-lqoH49KegtwJzcq0Pv8vJECMG0pts.jar
    Oct 19, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-JXH97wfzbkN07uFFrtFRTxfWGI7WVRsH-QXnX521Upc.jar
    Oct 19, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-aW98m7lJwaYnDhvpTCLGig73vP-p3NsPGXhpvwr1KTY.jar
    Oct 19, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-E-zjDLctbQCQRic8noTF0Oy89ufT9om4UrvhClD2B58.jar
    Oct 19, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-xsXcjsWOWT4i1T8HZ57FRpY_Q2PTUaKfNInP7W_f4B0.jar
    Oct 19, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-9kOimg567PEdWfVIBxRYjdLRYUwC4cUL-vYfqezQn4c.jar
    Oct 19, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-1QXEzE3x6zsOoUKfpj5bGQ_jexsuzfA_GaZTgwVFhs4.jar
    Oct 19, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-WZ-vIHJGZPNmiONkVFPyT9NEXfjz0se5tT4k2j3tgCY.jar
    Oct 19, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-rTWBENb0Xfcc4sy8O-cimXVyYSRzyIzB9VH7j-nwAL4.jar
    Oct 19, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8737061183911732149.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-TC841f-niUU0tOFyhpK-Pm48X67R9oTlSAd9Y44GjHQ.jar
    Oct 19, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-68NdlEs0qZ_CcP2KrIQOnJIKUc0T3gHTouG_Da8-e9c.jar
    Oct 19, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-WR_shzBAkTyoEWD0bgKFKlac4OAkH4UqJI8K8-VTsAQ.jar
    Oct 19, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-K7qBc6QqNjR-eUShdfVo7PEfSvqjGbNVaQykBYbz9ro.jar
    Oct 19, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-tLPI4rJr3DfIiPICj-M3ty-AsQT8UvnOhrxha8jCiEc.jar
    Oct 19, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-VPkHwbul4i2PAVMb1Gcw0INjvsZWk4Zk5iRjqY7b6p8.jar
    Oct 19, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-1-wLcgIYI3peD-Xni5Z6vg7zq_fNathx7nnlr6-P1u8.jar
    Oct 19, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-FYS2Itect17SlniKbfcY_8G0t-7B3wGBTMflcM5WPB8.jar
    Oct 19, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-mXglVKt6bF_J38dFR9k9HhLD8jLaQAka_NJE6fuYEDU.jar
    Oct 19, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-yvMQvVQM1h0DNYfcamCHh14qdli_fqkMrPOkBADrvAM.jar
    Oct 19, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-xCoL7Vfzay5XNMnXPZZyqOsP6tJ_2aB2NOnPI74cyxc.jar
    Oct 19, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-2d5tNL97X3Bwf54mluEWq31e85BGlPPbbPbeTmcGTHk.jar
    Oct 19, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-QSzU2sAqFvUrz80SSKza8ZCaeHp5sDKMxwttk_SAX7Q.jar
    Oct 19, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-3cIcOo73qemoJL0C5nwDXtfTaXUxPQJCewqtH1hAy_g.jar
    Oct 19, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-wIvNOC3a60bWZb6WYUGW8coVYz1x4SxQJ97rWBHQMlg.jar
    Oct 19, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-7UZlPCJRjSvrIWKU2C9XdSVvqAO0XLEMvhzg5dh8tRQ.jar
    Oct 19, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-96L8xPokPafblXrBGn7PRa6bbABoc_B4CZLcEsMIFfs.jar
    Oct 19, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-AzZjena6CJSkzRR8qy9ND30In-2beg63ZbOtuBmLN4g.jar
    Oct 19, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 0 seconds
    Oct 19, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 19, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 19, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 19, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 19, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 19, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96017 bytes, hash 6e566814700999c18384c0b3371433ab973a31112e7bd856b03a4c926c4cdc10> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-blZoFHAJmcGDhMCzNxQzq5c6MREue9hWsDpMkmxM3BA.pb
    Oct 19, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 19, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-18_17_45_19-7394122383023338468?project=apache-beam-testing
    Oct 19, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-18_17_45_19-7394122383023338468
    Oct 19, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-18_17_45_19-7394122383023338468
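
Besides the gcloud command the runner prints, a submitted job can also be cancelled from the harness through the PipelineResult handle returned by run(). A minimal sketch, assuming the pipeline has already been constructed:

    import java.io.IOException;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;

    class CancelSketch {
      // Submit the pipeline, then ask the runner (here Dataflow) to cancel the job.
      static void runThenCancel(Pipeline pipeline) throws IOException {
        PipelineResult result = pipeline.run();
        // ... decide the run should stop early ...
        result.cancel();
      }
    }
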
    Oct 19, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-19T00:45:19.276Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 19, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T00:45:30.052Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 19, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T00:45:30.790Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 19, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T00:45:30.834Z: Expanding GroupByKey operations into optimizable parts.
    Oct 19, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T00:45:30.867Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 19, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T00:45:30.944Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 19, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T00:45:30.969Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 19, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T00:45:31.016Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 19, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T00:45:31.050Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 19, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T00:45:31.446Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 19, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T00:45:31.525Z: Starting 5 workers in us-central1-f...
    Oct 19, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T00:45:58.130Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 19, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T00:46:04.143Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 19, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T00:46:17.237Z: Workers have started successfully.
    Oct 19, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T00:46:17.275Z: Workers have started successfully.
    Oct 19, 2020 12:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T00:46:53.934Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 19, 2020 12:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T00:46:54.111Z: Cleaning up.
    Oct 19, 2020 12:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T00:46:54.220Z: Stopping worker pool...
    Oct 19, 2020 12:47:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T00:47:41.596Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 19, 2020 12:47:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-19T00:47:41.671Z: Worker pool stopped.
    Oct 19, 2020 12:47:48 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-18_17_45_19-7394122383023338468 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ac7ec8d4-cc2d-48d2-94a8-b9278a46c8dc and timestamp: 2020-10-19T00:47:48.480000000Z:
                     Metric:                    Value:
                   read_time                    16.855
                 fields_read                 4375276.0
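
read_time and fields_read are collected by the ParDo(TimeMonitor) and ParDo(RowMonitor) steps in the job above. A sketch of how a RowMonitor-style counter can be kept with Beam's Metrics API follows; the namespace, counter name, and DoFn shape are assumptions, not the test's actual helpers.

    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.Row;

    // Counts every field of every Row it sees and passes the element through unchanged.
    class RowMonitorFn extends DoFn<Row, Row> {
      private final Counter fieldsRead = Metrics.counter("sql_bqio_read", "fields_read");

      @ProcessElement
      public void processElement(@Element Row row, OutputReceiver<Row> out) {
        fieldsRead.inc(row.getFieldCount());
        out.output(row);
      }
    }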

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 19, 2020 12:47:48 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
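
The InfluxDB warning means the numbers above stay in the console only: publication is skipped because the measurement/database settings were not supplied to the run. A sketch of the guard pattern the warning implies; the settings holder and its fields are hypothetical and stand in for whatever configuration the real InfluxDBPublisher reads.

    import java.util.logging.Logger;

    // Hypothetical settings holder; the real publisher gets its configuration elsewhere.
    class InfluxSettings {
      final String measurement;
      final String database;
      InfluxSettings(String measurement, String database) {
        this.measurement = measurement;
        this.database = database;
      }
    }

    class PublishWithCheckSketch {
      private static final Logger LOG = Logger.getLogger(PublishWithCheckSketch.class.getName());

      // Publish only when both required properties are present; otherwise warn and
      // skip, which is the behaviour reported in the log line above.
      static void publishWithCheck(InfluxSettings settings, Runnable publish) {
        if (settings == null || settings.measurement == null || settings.database == null) {
          LOG.warning("Missing property -- measurement/database. Metrics won't be published.");
          return;
        }
        publish.run();
      }
    }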

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 43.077 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 32s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/cpxtkt3zscbcw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1131

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1131/display/redirect>

Changes:


------------------------------------------
[...truncated 272.07 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 18, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 18, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 18, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 18, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 18, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 18, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 18, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 18, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 18, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 18, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 18, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 18, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 18, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 18, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 18, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 18, 2020 6:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 18, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 18, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-tRLYNaqzB0Fa9IZUhB8iAAc3bK7BJDh0hd00Nn2s3-g.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8737113483488144244.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-_4lIVn8jpD-QRiVAtSMvXYRyiPUyh_pZmc-90upjxWE.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-r_w5DwjJH_OiOwRKTjm9tlvHzfevDi70Cqudpvg707I.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-pX4JMrHV__x8bvdP-JvClwVEn7FKPkyPkm0HN25BPYE.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-hjplmCXQln7MYn7qDtzFe4m2S3DJmt38rlFIH6yaKgk.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-eU0tiQDxln443R6Z5CpEA4ihwCngHbZ-sJ6mUci6hiw.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-zBjudZiT_ZXbtQt3YR72XJDKa_9bV66xYhjglFfZPW8.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-kwK3OfUl_dP4PYKcax8i3E_1y7vqS_rvWrtRR5q7RZ0.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-T8bZ0a6OKvM7-iUiI3PihEf85dzioCL3rGX3-OdnEnU.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-g7h3hVKUriPBk1XRBG6RpUthtZYy-ZUzJZ5slSW8RLM.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-AoR2GV9u8P21RYtLqueeJbnwR97RU0gE8tcEevMiH_A.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-TDi9y1ZQz5peuBueVz-7nbcqQjwZHKo2yI1VTXyDHJY.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-tRLYNaqzB0Fa9IZUhB8iAAc3bK7BJDh0hd00Nn2s3-g.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-NJOI0dCQlkYx470GH29_SCZ_AUv8fxo1_8H8pHEsy34.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-HLrCDL2X-VY-b-Bb3nv0hQSvIDex7So8YhidZXjY0j4.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-Dddjeyokwy1xJRXPHojpniTKxzsszKTN57wlqup7EWQ.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-3rbjOy19mblLPFd6LOisrvUO0i3RxyR35ZyosrSdQpE.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-dPfJr2PdCphy0n-HDDWWh0Z5Z8dLtut2a6iuKHFDka8.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-NQnqO8-BiqnVjmTX9ax_r97vYPtrpaNnHNsE7el8mLg.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-ow-JePdZzwSq_9BK4N-nHHMs1pUPruI6MsyMGxg7bzI.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-n6zrTLyHmVyCx6x5N8zO65IUPm0yQzq1dcybmOzi3BY.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-xqqNraFcxN8NugzDqjYyIF_2dXK-QrD4WgR12S1IFpk.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-FCp6L421vSJrwOL-3RXZzib2maF5hswXsYeuu19pCLY.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-gCfLPYGDXCSLbw7GW2vBe1DnVegdHS6l1abwxkODlWY.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-51SpFru-YJhPGoh7TjnLw5aU7MkYPJbzrFqupuQAYDs.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-AJ6beOrxfnVAclVdd519Hw244TxytfeOCFb7pSZ3eyk.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-A5Mg29P3oB7MSp5Z8ucp9npL2yh4tDgiVP-OSBVB-Iw.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-pk6eG2MEyqW0e_tRPxoZCDRlBb2D4rI3UpDhWd3Wlrc.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-YfWhf8EZ1WXkmvK0p9xbNrDsaMQWiQbuHxqpzNVSHqI.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-Ai2grjlkh6XbLwGHEizXa11rwV4y0bFG6DGXBiHhscU.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-7aol5a14NDW0_TxidBdOGK6Zk4ZmhdwSSY5vRnD3B5Y.jar
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 0 seconds
    Oct 18, 2020 6:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 18, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 18, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 18, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 18, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 18, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96018 bytes, hash 35f99072ce218febc9264cca507914e794496cfc92e2a775a5f7fc164c38416c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-NfmQcs4hj-vJJkzKUHkU55RJbPyS4qd1pff8Fkw4QWw.pb
    Oct 18, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 18, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-18_11_45_16-14382144578797941294?project=apache-beam-testing
    Oct 18, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-18_11_45_16-14382144578797941294
    Oct 18, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-18_11_45_16-14382144578797941294
    Oct 18, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-18T18:45:16.429Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 18, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T18:45:23.420Z: Worker configuration: n1-standard-1 in us-central1-b.
    Oct 18, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T18:45:23.991Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 18, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T18:45:24.020Z: Expanding GroupByKey operations into optimizable parts.
    Oct 18, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T18:45:24.057Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 18, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T18:45:24.129Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 18, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T18:45:24.157Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 18, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T18:45:24.192Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 18, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T18:45:24.251Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 18, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T18:45:24.584Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 18, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T18:45:24.654Z: Starting 5 workers in us-central1-b...
    Oct 18, 2020 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T18:45:38.941Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 18, 2020 6:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T18:45:52.615Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 18, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T18:46:12.290Z: Workers have started successfully.
    Oct 18, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T18:46:12.322Z: Workers have started successfully.
    Oct 18, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T18:46:46.125Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 18, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T18:46:46.297Z: Cleaning up.
    Oct 18, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T18:46:46.378Z: Stopping worker pool...
    Oct 18, 2020 6:47:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T18:47:34.844Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 18, 2020 6:47:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T18:47:34.885Z: Worker pool stopped.
    Oct 18, 2020 6:47:40 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-18_11_45_16-14382144578797941294 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): e07827ee-b069-4c8e-8038-0f9351dc1664 and timestamp: 2020-10-18T18:47:40.698000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.429

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 18, 2020 6:47:41 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 2 mins 38.021 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 25s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/nygeazvxj4472

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1130

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1130/display/redirect>

Changes:


------------------------------------------
[...truncated 272.09 KB...]
    Oct 18, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 18, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 18, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 18, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 18, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 18, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 18, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])
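
For reference, the plan above is the Beam SQL translation of the test query. A minimal, self-contained sketch of running a comparable query through SqlTransform is shown below; it uses a small in-memory PCollection addressed as PCOLLECTION instead of the BigQuery-backed beam.HACKER_NEWS table, so the schema, rows, and pipeline setup are illustrative assumptions rather than the test's actual code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class SqlTransformSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Illustrative stand-in schema for the columns the query touches.
        Schema schema =
            Schema.builder()
                .addStringField("by")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> result =
            pipeline
                .apply(
                    Create.of(
                            Row.withSchema(schema).addValues("alice", "story", "Hello", 5L).build(),
                            Row.withSchema(schema).addValues("bob", "comment", "Re: Hello", 1L).build())
                        .withRowSchema(schema))
                // A single schema-aware input is referred to as PCOLLECTION in Beam SQL.
                .apply(
                    SqlTransform.query(
                        "SELECT `by` AS author, `type`, title, score "
                            + "FROM PCOLLECTION "
                            + "WHERE (`type` = 'story' OR `type` = 'job') AND score > 2"));

        pipeline.run().waitUntilFinish();
      }
    }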


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
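
This is the generic coder-inference failure the Java SDK reports when a PCollection<Row> reaches pipeline construction without a schema attached; the message itself points at PCollection.setRowSchema as the remedy. A minimal sketch of that pattern follows; the input PCollection, the DoFn, and the field names are hypothetical placeholders, not the transforms this test actually builds.

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Schema matching whatever Rows the DoFn emits (placeholder fields).
    Schema rowSchema =
        Schema.builder().addStringField("author").addInt64Field("score").build();

    PCollection<Row> rows =
        input                                               // assumed upstream PCollection
            .apply("EmitRows", ParDo.of(new EmitRowsFn()))  // hypothetical DoFn producing Row values
            // Without an explicit schema (or coder), construction fails exactly as above.
            .setRowSchema(rowSchema);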

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 18, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 18, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 18, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 18, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 18, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 18, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 18, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 18, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
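
The projection (usedFields=[by, type, title, score]) and the filter logged here are, in effect, what a hand-written BigQuery Storage API read would request. A rough equivalent using BigQueryIO's direct read is sketched below; the table reference is a placeholder, and the exact options differ from what the test harness configures.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.values.PCollection;

    PCollection<TableRow> rows =
        pipeline.apply(                                        // assumed existing Pipeline
            BigQueryIO.readTableRows()
                .from("my-project:my_dataset.hacker_news")     // placeholder table reference
                .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                // Request only the projected columns from the Storage API.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Push the supported predicate down as a row restriction.
                .withRowRestriction("(`type` = 'story' OR `type` = 'job') AND `score` > 2"));
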
    Oct 18, 2020 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-UIbwpUH1rKV9_81rKsEYEtBO9uAa9JwWq7wuwDAG5FI.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-4SoKIpOsIvM-bgH4aMcDDxRGjgYzF3wCKVuq97cpgVU.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-UIbwpUH1rKV9_81rKsEYEtBO9uAa9JwWq7wuwDAG5FI.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-F6RSYO5Cd_QYlIImd70KpzMDyilZdi7-SmpUfldwsS8.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-L387xwFsN5IaIl4THfp3YeA0-EA4vTopNa_MseN50ZM.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-TzEFH7pHNhdIWadARIklA2T89HnzePoTBA_O37WJt7k.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-FIc8GoBG1BPs1VYEpESdv0He30FM6X7NldqLSY6AOLo.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-kNZEenaKLrDt81J6_yhpRnW9KkitBPAeqYX2qB_rKtU.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-IGmojWrBNPlFMy5aS_ohyLi-wZfmegcN8af7AW1BZ8E.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-q1BOo0jLJzXT0X-jFsnR9TFgZKmpyGmwuq5lGDvHv-g.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-h0slC5bRM60mGu8zUuRvxW2BUW98APSPCb_5pGAwNg8.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-wzqifWxmSwVeLjQ_76OlNcfzbjjEfOICFpo5MrrjbpU.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-sbUvpt4M4spNa8tmqhO4KUqkUqqlLliL7nN5x9-v96s.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-3qU2axmKqq9YgjeTj-hAfMQaohQ1HnKLU8qhi1tg8Zo.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-IEBGm3Y2GS7gq04jGNscsAz2_0dfkLjILp3mJhV0nTM.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-CS4ntFUZ_r5dhDFd5Jm1akgRJlQOISbjbcvWI_0nh68.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-Y1emYkIr7ym53C-okSxMIfmLfWhDK2XcHiMpWu6DQCU.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-rshDSjjth-BCXETHeeWx86iOk_K0FZaUCQ8w6IlDnek.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-EXUo01kUM0pLMo0wNhlc0_hnEtcIy2v4IvUB2pY25ds.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-s6zKTZgz1gCoKi3o0JXUzinZn1O5Zb3fMG55CXRkR3I.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-UGwsE0D-S6ZYkD1LrolcNQaM_U9AoPal3qAN1wEQCrM.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-r3WWZ4HMmmDYjw3dqcwuy64q6XFYopZNVTD70kBrCs0.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2245824606906015532.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-qsg_Z090W8hT6yE8UgFY7J1VBIKrt5M2nO2ysxKWBcw.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-_INvjBBbc0B-WbyFiI4b8taCSv6y0F5NWc4Cl4922yE.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-w9taT2ksM9Q-RqZnPqocyAtJtwlWkRdQnvlVnzWGD2A.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-k82GT2RuYqmcMvPnBF2SReMgqSc7PkjBzf-q1gFSiE0.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-HtGRCwaWbwlXS3RFTjJ2d9abunbafXIIqLvmAWRjGbk.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-pKv-aAGLTFaMOopcaOfYvSDwFWt2xNp45ZnySc4wm6g.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-5hu5BdJ8hsHpPSDLwhYUlNRWlO7wZYNJuL3NCYPdkyE.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-D-YP45Vv19xI3GM1V0ra1teKIQHjp2KNt3tQbGwE2bY.jar
    Oct 18, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-LG40e98xeRXLCrMb64J4zv4HG2QuqIsvSH1wlH3IBJo.jar
    Oct 18, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 0 seconds
    Oct 18, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 18, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 18, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 18, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 18, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 18, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96019 bytes, hash 0bfb8a55169e9600da419140b50ac15b41217b88c3268f6a67733362108e92cc> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-C_uKVRaelgDaQZFAtQrBW0Ehe4jDJo9qZ3MzYhCOksw.pb
    Oct 18, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 18, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-18_05_45_16-4291198803537892704?project=apache-beam-testing
    Oct 18, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-18_05_45_16-4291198803537892704
    Oct 18, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-18_05_45_16-4291198803537892704
    Oct 18, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-18T12:45:16.928Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 18, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T12:45:25.705Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 18, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T12:45:26.515Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 18, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T12:45:26.555Z: Expanding GroupByKey operations into optimizable parts.
    Oct 18, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T12:45:26.595Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 18, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T12:45:26.669Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 18, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T12:45:26.697Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 18, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T12:45:26.722Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 18, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T12:45:26.750Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 18, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T12:45:27.117Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 18, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T12:45:27.199Z: Starting 5 workers in us-central1-f...
    Oct 18, 2020 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T12:45:41.538Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 18, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T12:45:50.297Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Oct 18, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T12:45:50.316Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Oct 18, 2020 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T12:45:55.615Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 18, 2020 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T12:46:12.808Z: Workers have started successfully.
    Oct 18, 2020 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T12:46:12.836Z: Workers have started successfully.
    Oct 18, 2020 12:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T12:46:42.956Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 18, 2020 12:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T12:46:43.197Z: Cleaning up.
    Oct 18, 2020 12:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T12:46:43.269Z: Stopping worker pool...
    Oct 18, 2020 12:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T12:47:41.579Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 18, 2020 12:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T12:47:41.614Z: Worker pool stopped.
    Oct 18, 2020 12:47:50 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-18_05_45_16-4291198803537892704 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ef08c5fe-eaa0-42f1-9e18-03a436f84bb5 and timestamp: 2020-10-18T12:47:50.191000000Z:
                     Metric:                    Value:
                   read_time                     12.34
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 18, 2020 12:47:50 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 46.621 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 34s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/pnu3uspqf7mbs

Build cache (/home/jenkins/.gradle/caches/build-cache-1) removing files not accessed on or after Sun Oct 11 12:44:20 UTC 2020.
Build cache (/home/jenkins/.gradle/caches/build-cache-1) cleaned up in 0.149 secs.
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
No JDK named ‘JDK 1.8 (latest)’ found

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1129

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1129/display/redirect>

Changes:


------------------------------------------
[...truncated 273.22 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 18, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 18, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 18, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 18, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 18, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 18, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 18, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 18, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 18, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 18, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 18, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 18, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 18, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 18, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 18, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 18, 2020 6:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-N0WWLobhbWM2kFa65mf9zg9wcwChaWCZzLaOMhp5LFQ.jar
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-vMVkXSQwGhpBsHFGSPcuRM7fOQ7KiExJRZReqn8Bec4.jar
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-_7pHUVT0EjlMGiFZyAgxGTtsWktJdduY3U7K7U8x2sk.jar
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-EoHOnzR8jAbk5JjCeWYkYtu5-FyXBE8A7hqA7cAV-D8.jar
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-8tck8ZCFIUVWx8ifGfI5jgbz7pnLW9vJZrhAYWw8RTw.jar
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-ngUkan3UX_c-Nb5_oUBj79VCRgKnYUiX9OLFbaSGV8E.jar
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-n4lcexlRzqvswpwjCURITmDG2rNqxhjZN8W4FxI9pE0.jar
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-eBvOXWZkX1JIfcGhIittDT0U7R7yO4Tto4WdckfuEew.jar
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-88HxIs61S8mGDJHqxDfMI89ON9dzuhjImXsbL23QXxc.jar
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4683057855908257680.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-MbBUGOSJ9KCaY7CLvgGVR-HS18HIHQSErAyD9P7tsmM.jar
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-V6nNT3W2ul_Kp8UulCpH5ts_x1qmo1GrvD0R35uxH6U.jar
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-yJD7JnpSp5YrWAveJhtjGx0Ws3AkCZFQZFZhEDREc7w.jar
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-ucSZIq1x6EU6md1_JynGpF_-P9xOWDGY1xt0mTNMIbY.jar
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-gufbxgkmvhlc5OoCdqVVFsZ9yq83HfnaiHJD6v-kwSk.jar
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-SqxBlDecHpzyCsutKbwrleBdssrgBkXwrmSICzP3LO8.jar
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-1vdj2W6JiT4dCU9susYzNDrGs8Ff06m9N7BCnYwxOMY.jar
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-7rVOJpdSboVKUPjiL9DemOh9ZUNcT8UarDnqzHszSwU.jar
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-8JSQBsVXdKV8no3fJaPZRCGqbPhk-n4dSsLrADx64uI.jar
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-pJDkM4FDmy7vavhUNYPnws5YRj8KWV0xipldE2RnT4I.jar
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-p4Iydbu3dUxTlBLWQ9piPO9gPxTx7LaLQr7BlqOZLBs.jar
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-u3EzlbwWor14owhHwJaULj6YNO_2vfJf0K811nlUq6c.jar
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-hnV7Ib2SAdlzK6ZkgXltoHG_c86rpko46yje5-QwzEk.jar
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-N0WWLobhbWM2kFa65mf9zg9wcwChaWCZzLaOMhp5LFQ.jar
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-4aQsTyEp6x-C0TYd6niOGm8LffDX-vRqSW298X1y2oY.jar
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-TV9U03SZPDgEqJEuWDlNRFnpwOJIH6wDtzthuosIby4.jar
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-FSkWauQuy8ci88D9bdzT4wTu1E1f1-D5A8BK8n_1cFg.jar
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-u7AD2GQJntA9m9MJ-qXpUJq0pbTJziynj5KRAIhYt9E.jar
    Oct 18, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-GSvyowYtxmaZA3wR3YiwOBCAx1yWPX6Zz2Dfc9KiPKk.jar
    Oct 18, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-nPeICRFNecjqlkpmP4sk-GU6sW27ZYnkbannlnXpXBE.jar
    Oct 18, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-TACMj3JC-6MIPQBRsqCLPRa5PTPecfL80_9pqSKBG4w.jar
    Oct 18, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-WeEozdq88yEg5R3W0rPcQevHM0YKVFWteobkhdwvB0A.jar
    Oct 18, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 0 seconds
    Oct 18, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 18, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 18, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 18, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 18, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 18, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96018 bytes, hash 4f6f3a3abd45a2dfb09f74ff6670ca612046e70b31cb4f905b1c0fee0840f234> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-T286Or1Fot-wn3T_ZnDKYSBG5wsxy0-QWxwP7ghA8jQ.pb
    Oct 18, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-17_23_45_16-6669475804889606947?project=apache-beam-testing
    Oct 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-17_23_45_16-6669475804889606947
    Oct 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-17_23_45_16-6669475804889606947
    Oct 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-18T06:45:16.278Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 18, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T06:45:24.660Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 18, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T06:45:25.283Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 18, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T06:45:25.316Z: Expanding GroupByKey operations into optimizable parts.
    Oct 18, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T06:45:25.345Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 18, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T06:45:25.419Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 18, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T06:45:25.437Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 18, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T06:45:25.470Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 18, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T06:45:25.496Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 18, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T06:45:25.905Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 18, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T06:45:25.972Z: Starting 5 workers in us-central1-f...
    Oct 18, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T06:45:50.479Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 18, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T06:45:53.279Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 18, 2020 6:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T06:46:10.438Z: Workers have started successfully.
    Oct 18, 2020 6:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T06:46:10.470Z: Workers have started successfully.
    Oct 18, 2020 6:46:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T06:46:49.141Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 18, 2020 6:46:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T06:46:49.309Z: Cleaning up.
    Oct 18, 2020 6:46:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T06:46:49.401Z: Stopping worker pool...
    Oct 18, 2020 6:47:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T06:47:33.279Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 18, 2020 6:47:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T06:47:33.315Z: Worker pool stopped.
    Oct 18, 2020 6:47:39 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-17_23_45_16-6669475804889606947 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): e61a5099-2c6a-4b8b-a2bc-363ad08846ca and timestamp: 2020-10-18T06:47:40.016000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    18.909

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 18, 2020 6:47:40 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 37.101 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 24s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/fhzoxgiuu3hes

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1128

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1128/display/redirect?page=changes>

Changes:

[chamikaramj] Updates Pub/Sub runner API transformation to preserve None values


------------------------------------------
[...truncated 270.65 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 18, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 18, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 18, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 18, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 18, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 18, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 18, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
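
The IllegalStateException above is the coder-inference failure the message describes: the Row PCollection produced by ParDo(RowMonitor) has no schema attached, so no Row coder can be built for it. A minimal sketch of the fix the message itself suggests follows; the field names and types mirror the projected query columns and are assumptions about the table, not the test's actual code.

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaFix {
      // Attach an explicit schema so the Row PCollection no longer relies on coder inference.
      static PCollection<Row> withHackerNewsSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();
        // setRowSchema records the schema and installs the matching Row coder,
        // which is exactly what the "please provide a schema" hint asks for.
        return rows.setRowSchema(schema);
      }
    }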

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 18, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 18, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 18, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 18, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 18, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 18, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 18, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 18, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
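
At the IO level, the DIRECT_READ plan and the pushed-down filter above amount to a BigQuery Storage API read with a field projection and a row restriction. A hedged sketch of the equivalent hand-written read follows; the dataset, field list, and filter are taken from the log, the project name is an assumption, and this illustrates the technique rather than the code the SQL planner actually generates.

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.values.PCollection;

    class DirectReadPushDownSketch {
      static PCollection<TableRow> readWithPushDown(Pipeline pipeline) {
        return pipeline.apply(
            "Read HACKER_NEWS with push-down",
            BigQueryIO.readTableRows()
                .from("apache-beam-testing:beam.HACKER_NEWS")
                // Storage API read, matching the DIRECT_READ method logged above.
                .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                // Only the projected columns are read...
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // ...and the supported predicate is evaluated server-side.
                .withRowRestriction("(`type` = 'story' OR `type` = 'job') AND `score` > 2"));
      }
    }
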
    Oct 18, 2020 12:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-ENBWMflMLwm3HPHGTSmaZeMm1WUeAlHQp6Cyewm09cA.jar
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-znrf4c8rjIJNE_qWnx0_a_ffC_7NEMGmjZympeej7po.jar
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-bCixN-tAT8dALxLfP_lYtrIVj19eqzgubSFPxC5QDPY.jar
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-FdlDlduNRggHyMsxsJE0YzR2g_yVPie9FSgAJxCe8JM.jar
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-ztTSB0cCUY7aGfaIWsxdQAN8nXN6nNXpz0csZy4d0CQ.jar
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-okSDU3pmVrK3oVqsDmkvw0bIavPnvfeB4XvUQLyLlsY.jar
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-jKeBzmm-3aB6W6q0rB7IGPUSyF5UIZ9gmqtOUTLA-Vs.jar
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-j0Onb1f2acg4ywtXHNO0KyPi0-ODgKkr1LNksCpAPkw.jar
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-eNude7RfifIpvEMW4pQ98lsvklUkHkQEaRXevsYu7lQ.jar
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-fbhlbtMY92WNSKJ6fPekwb8CdvKEjBOIbgS3ogqA1Ug.jar
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-vvVensLbtOUna9vdgoO-RcT2rTjo8Z5fUX2R4vUL7WU.jar
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-i3mZPjRU4PCBwlSPKy72RssFXwlOo9lPmzE2pI6grQQ.jar
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-87sNPvpj5DQV8L0Sct2lRKUum-WS44pVsN3yqZSsJuA.jar
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests--RcuVgbrDrO8DgiAvSu1ULlCq3TjCOa6rHj_DAnTvwA.jar
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-wZK1eSUnDrt1z0bsvCxlvvVGqFWlmsSAFRFt-ONN6bk.jar
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-xZqMCPQ_iii4b5gMlwSD8zw9gz2-Ta9wgsCRveb_WY4.jar
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-06k7ym2fpLHPBO2Z9uKcCF4TZjKhczw8CmkiE8KsShM.jar
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-Ve__9Vx1lLtx_Vd-7InBRBdxNpL_TmWT-d7m21Jm01M.jar
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-CXFoidhK5Z8NqlygHOrdUXtPMuBwe0Br_D_92w17DjI.jar
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-MINp2LhDgpQbjzV9khLS9RbhZpLLlwftZpFE3tzIzD8.jar
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8166945942055329903.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-E9GPcNEyw1J1ML3CStc9WG8KF7YWndonKFcmt_yffHE.jar
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-WUHPj2E4oaQsmzP8Et-1oLsb6cY48D7JSAC67ab15PA.jar
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-ENBWMflMLwm3HPHGTSmaZeMm1WUeAlHQp6Cyewm09cA.jar
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-IxeQ0qSdtGXcoD6VMZxEwDNwZcyIsJhAZ2RsgT-jaLA.jar
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-Fjo9DkqhUcd6wL7epCloH63P-EX7fUCQ8bUJ4WDdMZg.jar
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-npjZbwwOGWF-SgAa9VBjzGpPzd6oEBbXsMnaVep-_P4.jar
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-cxqJ-DeNNQ2vqGWw_TbwVM9wBMJOhdSVz-3ymh9oLvw.jar
    Oct 18, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-6RAepnIxnnjQtHux6qWjxoqYCyYnMZuE4QS6GWPsAI4.jar
    Oct 18, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-gq8I_4UfBThuFjfz8n2SvNQcoc-X1jhqhXe5lYcp2n8.jar
    Oct 18, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-tknEZPwHrnvb87_jQHA-2llxizEAzCOCLMOi5UXB06o.jar
    Oct 18, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-hgQ7DeRdGhqMqWyyVyxIRbSmp0m74A1d241mOIx6O08.jar
    Oct 18, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 0 seconds
    Oct 18, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 18, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 18, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 18, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 18, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 18, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96018 bytes, hash bc82305c0e32ce4fac70ddc479c87f15e8cb34145e9b29fa32ac4f987c319ab2> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-vIIwXA4yzk-scN3Eech_FejLNBRemyn6MqxPmHwxmrI.pb
    Oct 18, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 18, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-17_17_45_18-10087190935765222593?project=apache-beam-testing
    Oct 18, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-17_17_45_18-10087190935765222593
    Oct 18, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-17_17_45_18-10087190935765222593
    Oct 18, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-18T00:45:18.902Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 18, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T00:45:26.883Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 18, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T00:45:28.048Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 18, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T00:45:28.075Z: Expanding GroupByKey operations into optimizable parts.
    Oct 18, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T00:45:28.110Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 18, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T00:45:28.173Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 18, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T00:45:28.198Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 18, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T00:45:28.221Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 18, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T00:45:28.255Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 18, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T00:45:28.567Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 18, 2020 12:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T00:45:28.639Z: Starting 5 workers in us-central1-c...
    Oct 18, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T00:45:48.845Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 18, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T00:45:53.540Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 18, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T00:46:14.537Z: Workers have started successfully.
    Oct 18, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T00:46:14.565Z: Workers have started successfully.
    Oct 18, 2020 12:46:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T00:46:49.149Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 18, 2020 12:46:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T00:46:49.281Z: Cleaning up.
    Oct 18, 2020 12:46:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T00:46:49.389Z: Stopping worker pool...
    Oct 18, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T00:47:38.968Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 18, 2020 12:47:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-18T00:47:39.008Z: Worker pool stopped.
    Oct 18, 2020 12:47:44 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-17_17_45_18-10087190935765222593 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): fa2dcd91-a5a7-4525-9ee1-8a00ca6ac11d and timestamp: 2020-10-18T00:47:44.215000000Z:
                     Metric:                    Value:
                   read_time                     13.57
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 18, 2020 12:47:44 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 38.88 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 27s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/eay4wduo6pmqs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1127

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1127/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #12864: [BEAM-10650] Windmill implementation for


------------------------------------------
[...truncated 275.02 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 17, 2020 6:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 17, 2020 6:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 17, 2020 6:45:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 17, 2020 6:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 17, 2020 6:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 17, 2020 6:45:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 17, 2020 6:45:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 17, 2020 6:45:33 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 17, 2020 6:45:33 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 17, 2020 6:45:33 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 17, 2020 6:45:33 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 17, 2020 6:45:33 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 17, 2020 6:45:33 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 17, 2020 6:45:33 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 17, 2020 6:45:33 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 17, 2020 6:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-x8ZCKtDEixAv-ZpuoxwVk7ehE6zKxq9MmZu8jOtwDDo.jar
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-dCe566GpvIGP73RjA1_CrRBcyAJmq4Wgqm3nqxmYuFk.jar
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-KnJcdY4q7iJJH63QwUl428WGFLaZaL9-G4ghACPqkks.jar
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-2i4SIag0d8n3X0IkWNlJQtm-QBsn1c-h2_1A0XoLlK8.jar
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-RMlHCn4vnA9O-_VrVUK1BCriD4X_ufz895uDqa8W6Nk.jar
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-dCxDye0EtJNVMtiLZR7wDqfc_Td2mNzW-UW1sRG8vQE.jar
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-4bDWj35Wly7kL7S8V8Fppio2DWPpXTDghfzCGKLQW6E.jar
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tvUeU7PPAwxOPFddgup8IgAtrfa0oY22N3T5zekWfKs.jar
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test673756528643725389.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-UKYhk23dm7XACc29iLCHRTxvOUY7gfwziGkV4G7PBlo.jar
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-1RHs85mvVWlXEnMUnbAMRKdY2rB_krtkk2oviCmCJO4.jar
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-teeBLwVtvLu_1BI51O0XdCB2YHo11gKgcvl8zNnxRug.jar
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-f3tvy9tddqTuqBmi89n3CJOOafimgvGGOws8G3yI9EE.jar
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-sH5N2htjp9ccQjhajoKb9FuS1EfVy-l4_eVGnUgbFd8.jar
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-V5hnNG1DOwxso-MXPujYdkXE7kP8-1rBCVOIeurRVso.jar
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-WwkxIbmpSGb-ZupPLsr8B_xe6rFNxelShlDLv1BMzys.jar
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-x8ZCKtDEixAv-ZpuoxwVk7ehE6zKxq9MmZu8jOtwDDo.jar
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-sdcT2Ega0tA-P-HFQSGTirFRfg1bTBHuEwNYF0r1z_Q.jar
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-Vf7yUmyKhTZSvKl5OC19ESAdVHZhr_r2LDichHvM7yk.jar
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-YffxLxE4g2GCzNqIQhkAFDfY0sxB1NaWhElBUMGtZSk.jar
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-tXJsRCC9sXWMN5pfCZnkuNy7WZA1ex6hfDrJaUBn6D8.jar
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-C2jbkZYcFhhLafhutVVwN44tXUSuDaFPS59gK-VX37c.jar
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-hpftPEOq6u1HyfM-YTsQXD6GaX45oIZEaIUymT3mUng.jar
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-jIhkXcn0jnwkavh0PJXwLuqbsO2nAyJdvGCPDaCt6O8.jar
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-q4wt1HSBSvtM3QZ5LLE9cbstMyVZIFiHvWdz03WBSPY.jar
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-olF-ACEHXzFUYsblKgBhWq9r2qC7UxKjXTQxtm5QZl0.jar
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-W47QZ3mrcj5ijaYx-vBYPtgsJBHNIvdaaBw-mwUYgNc.jar
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-RJABNT2vynT1E1Q3e_iR5OUcOamTbqiA7M-bEwKZCsY.jar
    Oct 17, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-xeVvg1Hl6S4e_rmqYUrMTe_KTvCjLCOdqM_aYv1xP4g.jar
    Oct 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-KMsgIfpkK-ufsiusRw2C-JO4I1GoTblCiHge0zpR-aE.jar
    Oct 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-tRKJCTgBBlJdVz_etv8ZFTOzDB0IAxgK1CnK77Da0UM.jar
    Oct 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-NENdKjsvRJKsOg77RYd8HFBgIX32caMfWMBov4QEHGk.jar
    Oct 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 0 seconds
    Oct 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96017 bytes, hash b0bc63cafe57097c2264f28ec56749777407253ec73a36176ee6b03a1e1fb37b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-sLxjyv5XCXwiZPKOxWdJd3QHJT7HOjYXbuawOh4fs3s.pb
    Oct 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 17, 2020 6:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-17_11_45_38-6584566969518964922?project=apache-beam-testing
    Oct 17, 2020 6:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-17_11_45_38-6584566969518964922
    Oct 17, 2020 6:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-17_11_45_38-6584566969518964922
    Oct 17, 2020 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-17T18:45:38.092Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 17, 2020 6:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T18:45:49.509Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 17, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T18:45:50.064Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 17, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T18:45:50.145Z: Expanding GroupByKey operations into optimizable parts.
    Oct 17, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T18:45:50.187Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 17, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T18:45:50.295Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 17, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T18:45:50.345Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 17, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T18:45:50.378Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 17, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T18:45:50.408Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 17, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T18:45:50.923Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 17, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T18:45:51.005Z: Starting 5 workers in us-central1-c...
    Oct 17, 2020 6:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T18:45:59.845Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 17, 2020 6:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T18:46:13.942Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 17, 2020 6:46:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T18:46:31.414Z: Workers have started successfully.
    Oct 17, 2020 6:46:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T18:46:31.436Z: Workers have started successfully.
    Oct 17, 2020 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T18:47:02.033Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 17, 2020 6:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T18:47:02.229Z: Cleaning up.
    Oct 17, 2020 6:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T18:47:02.309Z: Stopping worker pool...
    Oct 17, 2020 6:47:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T18:47:52.029Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 17, 2020 6:47:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T18:47:52.083Z: Worker pool stopped.
    Oct 17, 2020 6:47:58 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-17_11_45_38-6584566969518964922 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): a6546624-4adc-4f32-b9f4-11bf60bc892a and timestamp: 2020-10-17T18:47:58.675000000Z:
                     Metric:                    Value:
                   read_time                    12.043
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 17, 2020 6:47:59 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 34.425 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 44s
107 actionable tasks: 69 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/4l5wf2ymxwldk

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1126

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1126/display/redirect>

Changes:


------------------------------------------
[...truncated 270.81 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 17, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 17, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 17, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 17, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 17, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 17, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 17, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
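
The failure above is Beam's generic missing-coder error: the Row output of ParDo(RowMonitor) carries no schema, so no coder can be inferred for it. A minimal, self-contained sketch of the remedy the message itself points at is shown below; the schema, sample data, and DoFn are illustrative placeholders, not the integration test's actual code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        // Illustrative two-field schema; the real HACKER_NEWS table has many more columns.
        final Schema schema =
            Schema.builder().addStringField("type").addInt64Field("score").build();

        Pipeline p = Pipeline.create();

        PCollection<Row> rows =
            p.apply(Create.of("story,42"))
                .apply(
                    "ToRow",
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void process(@Element String line, OutputReceiver<Row> out) {
                            String[] parts = line.split(",");
                            out.output(
                                Row.withSchema(schema)
                                    .addValues(parts[0], Long.parseLong(parts[1]))
                                    .build());
                          }
                        }))
                // Without this call, pipeline construction fails with the same
                // "Unable to return a default Coder ... provide a schema instead using
                // PCollection.setRowSchema" error seen above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }

Calling .setCoder(RowCoder.of(schema)) instead of .setRowSchema(schema) would satisfy the same check, since RowCoder is the schema-based coder for Row.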

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 17, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 17, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 17, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 17, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 17, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 17, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 17, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 17, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
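
The two INFO lines above are the visible effect of the push-down: only the four referenced columns are requested, and the filter is evaluated by the BigQuery Storage read rather than inside the pipeline. As a rough sketch only (the table reference below is a placeholder, and this is not the table provider's actual internal code), an equivalent read written directly against BigQueryIO would look something like this:

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("apache-beam-testing:example_dataset.HACKER_NEWS")  // placeholder table
                    .withMethod(Method.DIRECT_READ)
                    // Request only the used fields and push the filter to the Storage read,
                    // mirroring the usedFields / BigQueryFilter entries in the plan above.
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }

In the test itself this read is generated by the SQL planner from the BeamPushDownIOSourceRel shown in the plan, so the fragment is only meant to illustrate which BigQueryIO knobs the push-down corresponds to.
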
    Oct 17, 2020 12:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-9Atf9KyvgfsQYCdVsVON4PcCjA7yB9TD6rco7CjiRPE.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-pR-9U_Md2g1EUt8sQa_PFAxm8nPSveGQZIWEaXEcaPE.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-PvVoXwfCEoS8BLZfzXimZnBD994aMgXkubpNCAYpPqY.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-n1G9RGbIs1dkv0Ip7ZGdTZLOKodbg2RtO5rYyULLgSA.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-qRRAdUqIfpRiGyvqCbDHDAJtCzTxI7ocRyPotLSnbt0.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-PqMaj6q1VQqqReJz-RM91dgBL0Sgz2_UAQEtl0VA4MQ.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-M9dgAnD9F3U5vmb2pNorPTjUzztHA5Ldk6ihBXJZaIA.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-jOnQbbjg5h7uJ20MZW2dxI-5o8W1p9iNbsItX6Dlx5g.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-k4NXIz9OuA2Es-lwbyKiieqPEXZ5ma2NSUp0czUbJyg.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-vBKi5hX8rlBuhFrPuOXIOuY3uXapYCO_9pRskA0Om4s.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-WCQ6YuR-0KSC-uCYsCQWOJgvoBa6qX_mThVatvOSQNY.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT--bMpVQBOf53LDlvmntZ_ncZytya4ydE6MKCDT5d-D5M.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-3gNvecaePE_yVmR1dX7wnTPLVIuNYMXXRTQt_mHh8JA.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-EN0Fr4Zzp-P9H08l3tOkMKobSkX6wAbzoQ4ZRvrFDiQ.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-mxk9iAtewX2Hen9M3w1hwl47-d2zWM6pnTe6GO0fteY.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-6-WQTMSRT6ctx4MK5j_hLQiu-lu-PLuOUifK0ZKCKPY.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-exVhaCmmiIixij-ssbd5y2U4Rr3j76cI09Z0W8A7HJk.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-m5Ye2LOSF2yXiP42yaKiQ2-_vuScJEEk3erENk4M1zM.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6627940611543952301.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-RUOpBVG3CuRHJp0vwtI4xQe9wAbu3wygGqw5yRKCrDE.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-1x7QJFeKTHZhszvDXVdLC1DAOP4Icj_viOtIRq4XG1s.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-AgWr5DZxVmikEXLfD3YIcV705D6OgaHh_Z7wWM5WE20.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-rzGyb0rfMG9IxM1TyryO7nUaspk0jz1kPKxkg1YdU5o.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-Ay7BB54TV_ErPz0l40EMbXfUuq3nYK73itq_pj0xH5Q.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-DrFc3JgBWOIcdVLQaW74PvN6DTs-EhIVVHR9l4iC6Sc.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT--OTjVE6YT-7Z2afyzEQWAcb4Qrh6dqmbOzKaf_pOT5c.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-YTH1wHPG9Vs4KM_aWeZeBQUAHgirtkVkaITxw6kEj5Q.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-u1iXcT2h4ZzNdQs-XsfUWB66tvYerDTnFeW8ae_wHDU.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-9Atf9KyvgfsQYCdVsVON4PcCjA7yB9TD6rco7CjiRPE.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-bVE4oWNtsMLTlTj5n6KL6zvb-3dYhRuCC-RkG8TFmKY.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-G3WfgzyQCT5XnIIw1wLkt3IPfw3KW_mXhKy1M_a1KU0.jar
    Oct 17, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-oi4HrzUAyasXZqIgPo9YJf6K1P0nXf8UD-nIirTpytk.jar
    Oct 17, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 17, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 17, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 17, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 17, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 17, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 17, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96018 bytes, hash 0b90d83c06b2a2ecd0d61c7917f43acb558d2bf76de8c041101718a0fc9ab0d4> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-C5DYPAayouzQ1hx5F_Q6y1WNK_dt6MBBEBcYoPyasNQ.pb
    Oct 17, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 17, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-17_05_45_15-752966595231442107?project=apache-beam-testing
    Oct 17, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-17_05_45_15-752966595231442107
    Oct 17, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-17_05_45_15-752966595231442107
    Oct 17, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-17T12:45:15.844Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 17, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T12:45:24.269Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 17, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T12:45:25.400Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 17, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T12:45:25.446Z: Expanding GroupByKey operations into optimizable parts.
    Oct 17, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T12:45:25.491Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 17, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T12:45:25.577Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 17, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T12:45:25.641Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 17, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T12:45:25.684Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 17, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T12:45:25.717Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 17, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T12:45:26.307Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 17, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T12:45:26.409Z: Starting 5 workers in us-central1-f...
    Oct 17, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T12:45:50.464Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Oct 17, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T12:45:50.483Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 17, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T12:45:50.551Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Oct 17, 2020 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T12:45:55.913Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 17, 2020 12:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T12:46:14.388Z: Workers have started successfully.
    Oct 17, 2020 12:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T12:46:14.429Z: Workers have started successfully.
    Oct 17, 2020 12:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T12:46:47.843Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 17, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T12:46:48.128Z: Cleaning up.
    Oct 17, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T12:46:48.298Z: Stopping worker pool...
    Oct 17, 2020 12:47:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T12:47:44.285Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 17, 2020 12:47:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T12:47:44.362Z: Worker pool stopped.
    Oct 17, 2020 12:47:50 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-17_05_45_15-752966595231442107 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 99658f75-ea83-4402-bbfd-0bb14a74be30 and timestamp: 2020-10-17T12:47:50.799000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.459

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 17, 2020 12:47:51 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 48.486 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 34s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/iufstgp2nc45o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1125

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1125/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #13055 from [BEAM-11006] BigQuery failsafe function


------------------------------------------
[...truncated 280.14 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 17, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 17, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 17, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 17, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 17, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 17, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 17, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 17, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 17, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 17, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 17, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 17, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 17, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 17, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 17, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 17, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-ri5rUDrj2YqfdKljlpDaa4QyXPz5jVhYvMEzSqkrh6w.jar
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-YR5f-MPpn6nxn_tCcU0XvwtuUfwgFMN_9FiUlWzgRT4.jar
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-ri5rUDrj2YqfdKljlpDaa4QyXPz5jVhYvMEzSqkrh6w.jar
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-dlzPcY5DQvBYCqVUxQGSyJECc8q7ggLGIa6FrI1JCLU.jar
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5625379977985656078.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ZjFcMa4CcLi-MPQ1mhGfkTbVv0EgQt76GDDWHXtNGrU.jar
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-yzChxNkRnhgZArTu9wK1YbhSN1Rb5X_wx9B8YjdcWTA.jar
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-3bAoBaJnI3KVUmkSXZhsIp5_AA-jTKSwcLUlW7nVtAY.jar
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-NLYdjJH0DZpDwiLLbjp5OzHZT3GDE35bVLFJHFWeyN8.jar
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-0H6hpAhvabCbaiuMMU58VfkCnAOX8QfhaF_e0_2mkRo.jar
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-BkpWBQ5PFBnC08Dl9eR4kZJyzEWOqPjiBc0Z98LH3N0.jar
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-xgfX8FDFY-lv9StEvaQvjaTc958EmUujoXixMlsUt3U.jar
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-ltZmZIm7hd4deN-LAZhKi-bhginh3ZnIvOfT1O4dkhY.jar
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-wE3GP6L01JK3zlaSvS3n6SFZRGjsQckyuPEDErSqAy0.jar
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-M3bwnIO-mBmcpocdLCapY3BzOzisj1GCiDw2rEASmBI.jar
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-_oWmvUbyZ2vgI0mTqhdRt6fWIv0nviEnDUGLQ9a8Qwg.jar
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-cKy4UMB7M4NqK1Ie55mbiFDp7ZBWBWqN_DZ1NQlbE3s.jar
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-q4F6QSCSaJ6xWrjNnKx_naxyS5AzdyblYCMcUa_Zr_8.jar
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-vNlWtmjIxuP_MRHUsP9Mlb-lP8A_60qX4-2YKJU6i78.jar
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-eORLOhD1pwTOzbT82RD1aS7PC15X3xLpiSg0S-XKQok.jar
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-0bMaXW0DhrOs8k5G5tq2bT5eH8cS4kEuG74Dkq0FTjA.jar
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-5F1iwYf7wx7Tm-fpFBjJg_Dp9mzPi8ZVnhHCrYMa2c0.jar
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-iq09wI969WDHr6u9Ri2gWmhMpC5b5nm35Xm6LDaRqMs.jar
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-lscW-ZEYCAWBLWkLyl2fkKto6bVqz1tFRA8qOXOXCnA.jar
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-BRqNtVFTRC9Pc_xwqFnhpPNl5Y87WXke7UBtVVqRTf0.jar
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-1UGV7uyU_AAI2Cs7TAX84q1w9FgTkmGmAwEhwoV3gRg.jar
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-gORFeHqjeV8ZLBsKY9VQpeGaxJbst9EdmFrOGO4mcKE.jar
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-glxnnsfwJyj6L42-YnZKFiOKyc6un3XMKHTwBY3fs1I.jar
    Oct 17, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-COLv-8z9lZiDeo81P1rQdT1bS1pZefXM2f7O-4_29bk.jar
    Oct 17, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-k-X5C_2wLT5ctsI6o_SHYNBWoWOazB6DS7tg9TZZYHA.jar
    Oct 17, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-ZLP3IeMR3IZxu_ktUiBTg5WnaJby_VgjPuUL-awVWh8.jar
    Oct 17, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-saWCgAcwxMBikAdX3Eo1Cm96FRUMuOoVYBttg77Pxpw.jar
    Oct 17, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 17, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 17, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 17, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 17, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 17, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 17, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96012 bytes, hash 37cd534429dad34aa84af4c0ecdbc07762d7f6537c868ee27f4464614678f58d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-N81TRCna00qoSvTA7NvAd2LX9lN8ho7if0RkYUZ49Y0.pb
    Oct 17, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 17, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-16_23_45_38-529814306407061760?project=apache-beam-testing
    Oct 17, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-16_23_45_38-529814306407061760
    Oct 17, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-16_23_45_38-529814306407061760
    Oct 17, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-17T06:45:38.081Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 17, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T06:45:46.550Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 17, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T06:45:47.258Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 17, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T06:45:47.318Z: Expanding GroupByKey operations into optimizable parts.
    Oct 17, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T06:45:47.356Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 17, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T06:45:47.443Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 17, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T06:45:47.482Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 17, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T06:45:47.519Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 17, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T06:45:47.553Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 17, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T06:45:48.015Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 17, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T06:45:48.139Z: Starting 5 workers in us-central1-c...
    Oct 17, 2020 6:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T06:46:11.554Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Oct 17, 2020 6:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T06:46:11.586Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Oct 17, 2020 6:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T06:46:16.898Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 17, 2020 6:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T06:46:17.431Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 17, 2020 6:46:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T06:46:30.181Z: Workers have started successfully.
    Oct 17, 2020 6:46:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T06:46:30.214Z: Workers have started successfully.
    Oct 17, 2020 6:47:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T06:47:03.124Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 17, 2020 6:47:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T06:47:03.304Z: Cleaning up.
    Oct 17, 2020 6:47:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T06:47:03.418Z: Stopping worker pool...
    Oct 17, 2020 6:47:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T06:47:45.769Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 17, 2020 6:47:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T06:47:45.826Z: Worker pool stopped.
    Oct 17, 2020 6:47:53 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-16_23_45_38-529814306407061760 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 9efc3599-2a10-45f5-aade-a69d0be54cf6 and timestamp: 2020-10-17T06:47:53.359000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.799

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 17, 2020 6:47:53 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 28.911 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 38s
107 actionable tasks: 73 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/6jcsngsv54sco

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1124

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1124/display/redirect?page=changes>

Changes:

[Kenneth Knowles] Re-sickbay unsupported metrics on Dataflow

[noreply] [BEAM-10114] Upgrade version of pubsublite to 0.4.1. (#12957)


------------------------------------------
[...truncated 279.92 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 17, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 17, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 17, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 17, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 17, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 17, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 17, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
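
The IllegalStateException above also names the standard remedy: attach a schema to the Row PCollection so a RowCoder can be inferred. Below is a minimal, self-contained sketch of that pattern; it is not taken from BigQueryIOPushDownIT, and the schema, class name, and pass-through DoFn are illustrative assumptions only.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Hypothetical schema mirroring the columns projected by the query above.
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt32Field("score")
            .build();

        Row sample = Row.withSchema(schema)
            .addValues("someone", "story", "example title", 3)
            .build();

        PCollection<Row> monitored =
            p.apply(Create.of(sample).withCoder(RowCoder.of(schema)))
             .apply("PassThroughMonitor", ParDo.of(new DoFn<Row, Row>() {
               @ProcessElement
               public void processElement(@Element Row row, OutputReceiver<Row> out) {
                 // Pass-through, loosely analogous to the monitoring ParDo in the log.
                 out.output(row);
               }
             }));

        // Without this call, coder inference for the ParDo output fails with the
        // same "Cannot provide a coder for a Beam Row" error as in the stack trace.
        monitored.setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }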

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 17, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 17, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 17, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 17, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 17, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 17, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 17, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 17, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
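
For reference, the filter being pushed down above is the WHERE clause of the query the test issues. A hypothetical stand-alone pipeline expressing the same query shape through SqlTransform is sketched below; the in-memory HACKER_NEWS rows and their schema are made up for illustration, whereas the actual test reads the beam.HACKER_NEWS BigQuery table shown in the log.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionTuple;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TupleTag;

    public class PushDownQuerySketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Toy stand-in for the HACKER_NEWS table; only the columns used by the query.
        Schema schema = Schema.builder()
            .addStringField("by")
            .addStringField("type")
            .addStringField("title")
            .addInt32Field("score")
            .build();

        PCollection<Row> hackerNews =
            p.apply(Create.of(
                        Row.withSchema(schema).addValues("alice", "story", "a story", 5).build(),
                        Row.withSchema(schema).addValues("bob", "comment", "a comment", 1).build())
                    .withCoder(RowCoder.of(schema)));

        // Register the PCollection under the table name the query refers to.
        PCollection<Row> filtered =
            PCollectionTuple.of(new TupleTag<>("HACKER_NEWS"), hackerNews)
                .apply(SqlTransform.query(
                    "SELECT `by` AS author, `type`, `title`, `score` "
                        + "FROM HACKER_NEWS "
                        + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        p.run().waitUntilFinish();
      }
    }
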
    Oct 17, 2020 12:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-DZAxComQcQ782ALyCSB3ULcG7zuLpwUqSn0kmuoJfec.jar
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-a_f1NGRBSW7vBZgSsYNzO0yEi9yA7gMBUM0WdXdKV7A.jar
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-aHB3KRXlQOUml9ZiclL40oTG2-R8fheQzQgqOrmzG7k.jar
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-XzQC4TtvrYDotTY8rjaky3L_rj_GmJtrlMBgLcLyT3Y.jar
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-Hdfd7dkj645l8I3Csb_mdySaVp_VF4unm_Y7DAdcJ38.jar
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-WASvIelg9YDUQK_1X0Dyp_RkL4IVRtzY2IfcmJEJ0oQ.jar
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-jzgZCnuUhtE9-ktRW5za_Nc4U04dhwhiSDoGkeaHJ6w.jar
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-gdIKDn5wJj5YRTHyTZKmqpE9Td5kNqzZ-16RWDLwNQs.jar
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-6Q2onbwL_1MEjKKsIFM-G8zaW3-XhSYH3i1cvFgDJSM.jar
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-wtHVUpWueThZYLY2i17a32CYaAHa2bUdh9Qv39NgcGM.jar
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-Xh3MaULhlza3iZG2rANyhVsmWWCyGAfbvPYKFkFkcxg.jar
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-DZAxComQcQ782ALyCSB3ULcG7zuLpwUqSn0kmuoJfec.jar
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-MgRYm4LUeNMj_oHU-yCTkF9gJ1RiAPTDfrvdUfgT8U0.jar
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-6gwUvLo6J3jEVleJwpGjs4bJnb0q2C6uJnLW4o8jxLE.jar
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests--MeuX6mHJllCati8CTquLBvakl8WK3Oo-zcTIeubpgY.jar
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-HhWOILwz0Hx1XP7bsJOn7vrNl3M7l0-A5Zy5An2yWNM.jar
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-zvvaKVXYdCZlVhD4CInZVd-QyCS3H7_gksdmz2lxEFM.jar
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-Iz4uXATRYonjywqP9fdEiXrWNJORAQzwkcwiarEmjLE.jar
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4660587161064794982.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Pdsa3kk27pt6lvb266gGV3oiVLSYacNc_rfrfMgEK1M.jar
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-GITmUMFqYYTugBWQ3Pl_iTT3uRIklINYepiJycpl6iM.jar
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-I0Q5KnBq-ZNey5p5naKixxkhv1BuClk8qxgkP46vsAQ.jar
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-r4_5VaWhT2A7NUojv5YvWu3JM-pydiJp5XBblVCLddc.jar
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-gStMOw3-amLSQZDQ8--5v1R3gxAVag_uF6_xAsalUOw.jar
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-zDoFg4D2qY8Zva_oLx7apw-fHidpuhRn-Lc9yBHb4nI.jar
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-EwfHdazmQ_Mo1ztpOmARbdNPFzDbq7lBGIF_OqGoG5U.jar
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-bO3jhgBb9yA_HxFLHc-2h6NTi_-oIu9XZ4aIcISvRkU.jar
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-GSd2ahBKapyU73lG59dM3xmKCUjB0RjU_b_R2EKo50U.jar
    Oct 17, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-34NLfC9YWururrZEBe3K4ZiRE2QPRLRu3O02yLrigk0.jar
    Oct 17, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-EHcEYi96q_sPqXfjxyB7fc0eyHpH3mTvd8xAgcgbtv0.jar
    Oct 17, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-Z71nfwqAUstglvtOwtx5uMFeCEVboYSNQZaU8h5zUPU.jar
    Oct 17, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-IV6ifMWl_8lbCT70NC_rVpGmm76J3WjKHH8YM6PJ70o.jar
    Oct 17, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Oct 17, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 17, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 17, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 17, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 17, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 17, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96012 bytes, hash 5b536d50ce06d51e45c158ddde0824fce1b2dae1aedc23ba1249314cd43701cb> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-W1NtUM4G1R5FwVjd3ggk_OGy2uGu3CO6EkkxTNQ3Acs.pb
    Oct 17, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 17, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-16_17_45_42-1756506156964357501?project=apache-beam-testing
    Oct 17, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-16_17_45_42-1756506156964357501
    Oct 17, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-16_17_45_42-1756506156964357501
    Oct 17, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-17T00:45:42.186Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 17, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T00:45:50.241Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 17, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T00:45:50.833Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 17, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T00:45:50.877Z: Expanding GroupByKey operations into optimizable parts.
    Oct 17, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T00:45:50.918Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 17, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T00:45:51.008Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 17, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T00:45:51.052Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 17, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T00:45:51.087Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 17, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T00:45:51.123Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 17, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T00:45:51.808Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 17, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T00:45:51.905Z: Starting 5 workers in us-central1-c...
    Oct 17, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T00:46:16.553Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Oct 17, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T00:46:16.596Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Oct 17, 2020 12:46:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T00:46:21.944Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 17, 2020 12:46:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T00:46:22.906Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 17, 2020 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T00:46:36.619Z: Workers have started successfully.
    Oct 17, 2020 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T00:46:36.653Z: Workers have started successfully.
    Oct 17, 2020 12:47:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T00:47:13.330Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 17, 2020 12:47:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T00:47:13.507Z: Cleaning up.
    Oct 17, 2020 12:47:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T00:47:13.628Z: Stopping worker pool...
    Oct 17, 2020 12:48:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T00:48:13.658Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 17, 2020 12:48:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-17T00:48:13.716Z: Worker pool stopped.
    Oct 17, 2020 12:48:19 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-16_17_45_42-1756506156964357501 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 9d9ad1cc-e870-441e-8404-515e0973d8d2 and timestamp: 2020-10-17T00:48:19.590000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    16.286

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 17, 2020 12:48:20 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 52.579 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 1s
107 actionable tasks: 73 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/wewy2jekgwz26

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1123

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1123/display/redirect?page=changes>

Changes:

[Pablo Estrada] Improving performance of file deletion in ReadFromBQ


------------------------------------------
[...truncated 272.13 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 16, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 16, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 16, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 16, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 16, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 16, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 16, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 16, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 16, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 16, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 16, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 16, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 16, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 16, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 16, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 16, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 218 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-620BTn8p4A83l_powjzFsbaYYfJubmKT7db_xP3BZEM.jar
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-QRaPVxCckKUM824El2WozL29kJFqKpz0iL2Al3xL3BE.jar
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-5IpWQi8s9iu7O1qk3UO6JRGUE9IybbJh-pf7iJQPb24.jar
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-mo92d3iOi3q1u9fL0XbAqYbjpvph8Bon4H_PgbxvVHE.jar
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-MQZzComAD5eMgVTWik3OLLNYIX5kIvT-MIkxkflIKwI.jar
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-ys3yUq-py5j1ko26qWvWxFOZqi84KTguG47EdJ15v_0.jar
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-NXbVJHr40g7l1QMjUDA3S74EG8Kx3UWhdaS0QU-ZgJc.jar
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-G-zOTsrbZzLUf0tNJ0lgM1O-2jHW669B9NrzOwYy6Z8.jar
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-rDpFpPL_FedFvsb_ksxlWnXeL84ySRfvyybXwunE1m0.jar
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-2TGEkwdv53wrRViEuKMQSujTJoGkd9r9CHYOWMAH_5E.jar
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-Y5WLYx1LSyVlSijDqcDySWfQTgVVWlzz4EYR20IHEGs.jar
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-HwZ9-rsWKxTjIHPfDSFVlLljWfqWGciXwakptKWmdRU.jar
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-Pc7japgp5NPvajfdtIbNt_I4kthXcgO0qRTwzJat6J8.jar
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-bskMxH1L-mHK6I6GNmW1Xs0V_k0I111qw6pUPT6Kv0Q.jar
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-TKCRr5jIaYjLb7hxaQoi5irdOQ_VkevNULV3BdrndEQ.jar
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-I4t7Mpos8KrQ0VP35_PLOHbb9S6yzi3KUl0Dvf_LrrE.jar
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT--Jru3mBFe_5KFhi_znd4izdsylg92NeYXuK_5e6QErs.jar
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-7py531qT6ErHeDxREH2-f9TRu18NGXiNQHaCF2afS78.jar
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-WgaP2FcoVPWPJAdWX1t6cuCXGOexlW00q-uOaCV08AA.jar
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-7d0Ie-g3-zN3rat4hxMbNNrMXr6v3n0uHK93Kb0COBU.jar
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-620BTn8p4A83l_powjzFsbaYYfJubmKT7db_xP3BZEM.jar
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-L3nN1xEi4AmfjdPAqH93BYVsPnjvw9YpuZ2douM4Svs.jar
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-v2mVWcOHPGmgML5Jo4hGXsJrewqRAGYjblf0PPtZkuM.jar
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7357294090613190539.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-64U8eScjyc8dQKaeBv54Hg3BK0vAX6LqNrcSfOgiruo.jar
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-JaumkKXkvX8nyBl9I9UEJqQuwisLs7-vGklLS7uK_Mo.jar
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-_dDEOvLpBqIXgiF45U5YmGPsYotDiIwcVOUMJgjKb6A.jar
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-sh5lfDOGFgL0g_Zdbwl8xjTDS5mVEQ2p7jEoYctTshI.jar
    Oct 16, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-qtrQqTctKn1SI8VeZIf_ta-ZrlRjQWxCMNYWzUOdWgE.jar
    Oct 16, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-D0Yt-_Q4NF8o-UxZWkdRyEvpxzQ-HzkciDgXC-JXsSo.jar
    Oct 16, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-YVj8RLJ3wLRQH852jaFHDDUGIw6tLvqsdoIuQj12FkE.jar
    Oct 16, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-XSiu_jOD9ZdNuZowb8_CCV-BW6rrfHSAQ6oY2959sV4.jar
    Oct 16, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 188 files cached, 30 files newly uploaded in 1 seconds
    Oct 16, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 16, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 16, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 16, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 16, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 16, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95594 bytes, hash 01b1fdf6d9b49ba563ecdaf2ea7a436700bfe23fa2315b3f550dce007ad47167> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-AbH99tm0m6Vj7Nry6npDZwC_4j-iMVs_VQ3OAHrUcWc.pb
    Oct 16, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 16, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-16_11_45_20-2190685175492657375?project=apache-beam-testing
    Oct 16, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-16_11_45_20-2190685175492657375
    Oct 16, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-16_11_45_20-2190685175492657375
    Oct 16, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-16T18:45:20.105Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 16, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T18:45:27.271Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 16, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T18:45:28.086Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 16, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T18:45:28.140Z: Expanding GroupByKey operations into optimizable parts.
    Oct 16, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T18:45:28.176Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 16, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T18:45:28.257Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 16, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T18:45:28.277Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 16, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T18:45:28.321Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 16, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T18:45:28.345Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 16, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T18:45:28.721Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 16, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T18:45:28.801Z: Starting 5 workers in us-central1-c...
    Oct 16, 2020 6:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T18:45:47.282Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 16, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T18:45:54.862Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 16, 2020 6:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T18:46:16.261Z: Workers have started successfully.
    Oct 16, 2020 6:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T18:46:16.300Z: Workers have started successfully.
    Oct 16, 2020 6:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T18:46:52.554Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 16, 2020 6:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T18:46:52.727Z: Cleaning up.
    Oct 16, 2020 6:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T18:46:52.798Z: Stopping worker pool...
    Oct 16, 2020 6:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T18:47:44.335Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 16, 2020 6:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T18:47:44.377Z: Worker pool stopped.
    Oct 16, 2020 6:47:49 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-16_11_45_20-2190685175492657375 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 8094036b-9c65-4460-bc3f-d8fcf1a09b83 and timestamp: 2020-10-16T18:47:49.388000000Z:
                     Metric:                    Value:
                   read_time                     17.66
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 16, 2020 6:47:49 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 2 mins 43.984 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 33s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/plp7d5j5q7awa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1122

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1122/display/redirect>

Changes:


------------------------------------------
[...truncated 272.28 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 16, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 16, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 16, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 16, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 16, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 16, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 16, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
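
The exception above also points at the remedy: a PCollection of Beam Rows cannot get an inferred coder unless a schema is attached with PCollection.setRowSchema. Below is a minimal sketch of that fix, not the IT's actual code: the pass-through monitoring DoFn and the four-field schema are illustrative assumptions (the real test derives its schema from the BigQuery table).

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    static PCollection<Row> monitorRows(PCollection<Row> input) {
      // Illustrative schema; the real table has more fields.
      Schema schema =
          Schema.builder()
              .addStringField("author")
              .addStringField("type")
              .addStringField("title")
              .addInt64Field("score")
              .build();
      return input
          .apply(
              "RowMonitor",
              ParDo.of(
                  new DoFn<Row, Row>() { // hypothetical pass-through monitor
                    @ProcessElement
                    public void process(@Element Row row, OutputReceiver<Row> out) {
                      out.output(row);
                    }
                  }))
          // Without this call, coder inference fails with the IllegalStateException above.
          .setRowSchema(schema);
    }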

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 16, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 16, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 16, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 16, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 16, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 16, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 16, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 16, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
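
For contrast with the DEFAULT-method failures above, the push-down run (readUsingDirectReadMethodPushDown) completes, and the BEAMPlan shows why: with DIRECT_READ the projection (usedFields) and the supported filter are handed to the BigQuery storage read rather than applied in a downstream Calc. The following is a hedged sketch of a table definition and query that produce this kind of plan; it assumes SqlTransform#withDdlString is available in this SDK version, and the table location and trimmed column list are illustrative, not the IT's actual setup.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    static PCollection<Row> pushDownQuery(Pipeline pipeline) {
      // DIRECT_READ selects the BigQuery Storage API read, which is what lets the
      // planner push the projection and the supported filter into the source.
      String ddl =
          "CREATE EXTERNAL TABLE HACKER_NEWS (\n"
              + "  `by` VARCHAR, `type` VARCHAR, `title` VARCHAR, `score` BIGINT\n"
              + ")\n"
              + "TYPE bigquery\n"
              + "LOCATION 'some-project:some_dataset.hacker_news'\n" // illustrative location
              + "TBLPROPERTIES '{\"method\": \"DIRECT_READ\"}'";

      return pipeline.apply(
          SqlTransform.query(
                  "SELECT `by` AS author, `type`, `title`, `score` "
                      + "FROM HACKER_NEWS "
                      + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2")
              .withDdlString(ddl));
    }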
    Oct 16, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 218 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-lUFFxbMOoebJ203BlMKHGqNDaskNOVne2W-zTTtzwmE.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-Z-AznFKyWT6dHRLB4GwJ0TeZH87wOOR-LqAiWmD2DXQ.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-C3WzOnQeAU0G_jjHX725oRubsm0OKupNBYOmnTa9s-U.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-T-gEYYgA6ujjsThyMqs7iTOjfDmn2vJVRcAL1-l-v68.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-8LG8fEEz8vPF38pCM7ts3jV3O-f0lX-oZ4fRobkUCCM.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-XVi1uIxquJ8dO36k1HUO5HOGe9jDaseVQofR4PoW-3Y.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-lUFFxbMOoebJ203BlMKHGqNDaskNOVne2W-zTTtzwmE.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-rCJJnWPNx90FWDCPVqW_lSq0_ze2al7JcRPxJSLpgqg.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test625852888066363889.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-5L1Ez-u45LO-Rn78BvXRQS1pIOsmfsQzVyaXWStc1js.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-SOpRT4RpRT4OJi_aHaPqiFxByosQRz7XP2ZQuo-wS5A.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-IVUcBNoAJvmB1zrJur_hjD1LWhb-wtB6zKqNdVWTmMc.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-SDKGYUfax_5PpakOe4-Agn5YmBHfs3a_H5FxeNV42no.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-G6L92lQxmg4wwtIpU4ikX46VV6j8pvUE8qeA3YSb_Q4.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-vG4kQi_WkoaFY8I5rgwX74F1OqgmJ55jhmQUeJQbxu8.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-gHaWoTfT85KZHBgy62FKwQHeLzgKDSTTwEWWb_h_20w.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-aJNCXOTAzcvfn2KOfBPHRiahkRdNlTBIzW7PZ1uLeeg.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-iPpTdoSwONEUFyNpLBhl5haoHLFSNWAApKVqCsTPpdU.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-2izyyNT1yN1jn01zdh-n1K66_Caqzw1I38x1AwGRS-w.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-Su8i7s2xGFoF14GQn5fDKIA5khghH9XYa1pd94a93Fc.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-w5ZwwDKj746eDsg19-u06nyTmDoMfHEkRZaYEIcfCuw.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-LUFHjdb6-CRpnl3cti0LlXQKu1SDjaMDaly3UNcE514.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-k6NEl4OVH8YOcc5UPKMrOKwI5oxGDatNlUJ6WNKhqSw.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-c6lZfal2jExp006rRZBEEKuRbP4ZMRqCNxuzo5inU_k.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-RfOjqrVlrrN-qqSj0fdLcUjTMve40lRaNaJw_WJkSuE.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-wBzRObj8s2gZWpymDkjA2Y_IrTZN3Pj9zs3fBi8-0Mk.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-hhzd_dEtxaUp2f_7DXeV27VLWCCDnrQ9jTr_ma84VNk.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-OB7hLRi5SAxvH9omKuaInV_2t4sGD7y7IJBJA_jKL9Y.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-XkN1LF74uTiTH98GVpUsn0NVS-YnlwdM95ggosQegm8.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-N5DuTXnfhKmPpD75bBN3MVacVuyrZrgQBCWSp1Kyczw.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-Ia-sJqN8PDhpwlCNz_RwIrLc2D6TBUg-cbCfqRoCi-Q.jar
    Oct 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-tjRQoQkCSxG1IIk54YMZM5R13sDIRgXt546GhZQwyxo.jar
    Oct 16, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 188 files cached, 30 files newly uploaded in 1 seconds
    Oct 16, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 16, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 16, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 16, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 16, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 16, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95593 bytes, hash e63e846a2749c517a8d9e1320e1990a14b96060170d899356ed881d56e26d0e1> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-5j6EaidJxReo2eEyDhmQoUuWBgFw2Jk1btiB1W4m0OE.pb
    Oct 16, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-16_05_45_18-503159456898151226?project=apache-beam-testing
    Oct 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-16_05_45_18-503159456898151226
    Oct 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-16_05_45_18-503159456898151226
    Oct 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-16T12:45:18.822Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 16, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T12:45:27.968Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 16, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T12:45:28.618Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 16, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T12:45:28.667Z: Expanding GroupByKey operations into optimizable parts.
    Oct 16, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T12:45:28.707Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 16, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T12:45:28.799Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 16, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T12:45:28.847Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 16, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T12:45:28.889Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 16, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T12:45:28.927Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 16, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T12:45:29.579Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 16, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T12:45:29.725Z: Starting 5 workers in us-central1-c...
    Oct 16, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T12:45:35.868Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 16, 2020 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T12:45:56.450Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 16, 2020 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T12:46:18.406Z: Workers have started successfully.
    Oct 16, 2020 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T12:46:18.451Z: Workers have started successfully.
    Oct 16, 2020 12:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T12:46:59.743Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 16, 2020 12:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T12:46:59.937Z: Cleaning up.
    Oct 16, 2020 12:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T12:47:00.070Z: Stopping worker pool...
    Oct 16, 2020 12:47:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T12:47:50.341Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 16, 2020 12:47:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T12:47:50.550Z: Worker pool stopped.
    Oct 16, 2020 12:47:57 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-16_05_45_18-503159456898151226 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 42a63921-d711-4975-a6eb-30dfe5acccc9 and timestamp: 2020-10-16T12:47:57.284000000Z:
                     Metric:                    Value:
                   read_time                    21.387
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 16, 2020 12:47:58 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 53.442 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 41s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/6gi7pluk4agga

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1121

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1121/display/redirect>

Changes:


------------------------------------------
[...truncated 273.01 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 16, 2020 6:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 16, 2020 6:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 16, 2020 6:45:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 16, 2020 6:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 16, 2020 6:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 16, 2020 6:45:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 16, 2020 6:45:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 16, 2020 6:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 16, 2020 6:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 16, 2020 6:45:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 16, 2020 6:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 16, 2020 6:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 16, 2020 6:45:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 16, 2020 6:45:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 16, 2020 6:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 16, 2020 6:45:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 218 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-T7s0nhdwIEaYFgMLwZiLU8WGZGy5cojIegPoZjuQuBI.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-sEao79tNcZwmLwPuG00HnzXedSqDvOdoJd_XMVBtOD8.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-TnsWc6lbsmWOpsva7-IcXqQGX-9gJhz4z_NwJEFNCnw.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-HrsYvP0JTFXd7QiFrdwc5JgNOPBktJ8Oi9dduwzBHl0.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-L967aR5vHRHdKMWo56V_MdrNlJu92sBV_WOAo-S2XJI.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-umlBPa_x_KryV5f1LRkgh0gSsi5nWNcNWsg2N0_rBm0.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-woNqpBCOLFfHfIzjCFsBfjPyZRuyeTjrNe8uqc2ay-Q.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-1RpC9N6_QqiJIpgidoDdCShMkj7uDsTP8qO8_neB9cg.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-t8CQ8OTkY9HcC-GDLgLdHF4nWr4ZJDIGJLgRaDZPPuQ.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-Ovv-5WhH7R3ilP58XgvIFcMXchGA-rOZt1mUA_U7zKU.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-91BdRwF3Z1rxTeTfjS5ovZgdap-v2ZUh-EoCwJNSUpk.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-OrXv83RTn4nlPl5in1bZItz-MkXfWZjFAKMPPoWhzHI.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-XuGrqImgTkqtkSJucPI-tngU2-PRAOw-U0VpQK-3xT0.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-BKsnOTK67-fNi_1lv9-3lWANy-P7sK-ag4GM4QGn6pc.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-07uQesDf0yxYRykvqALPchaQhmyLPmfmpDrL_NYr0mg.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-rncuVYsvSX2WX7c-8eMS3gT5DqC1Y2fLr42kmcp24wg.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-6t1Scyc-VabbEDvXzpwKARJWhCSpbN4jTeveeMq1Wck.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-TDLVkjBPBPq4JRNXX363hIo46kwUba_zVl_7IyosH5U.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-AAfswSCvkf7u2upue4uuxNysY9EzNLmimdow9up_GoI.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test9079044167026149414.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test--tyc0o3s2HAGxRodXkIJ_yFC-JkLiR1zA5UaizibGgs.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-RnTO8FQCjezi1sX_YaQ3TDln5Ze3qMvAEfIt9aRIGjM.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-vJ9v6eT6xDzgVCXMpKDzUOmB5WDiFHIu__TBO1G7WOw.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-ATq0Rzi_5YPteJXOpCWiXb5eaGkyjWPI309bQIBVOU0.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-T7s0nhdwIEaYFgMLwZiLU8WGZGy5cojIegPoZjuQuBI.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-shQfhKtfRQwJxkiFpWCErFYmtlcD3M3SiUcE8JeUxaI.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-f8J-yIa-XHaL19gro9Hkzh3mfFu-tV7s7nnc5JJ5vbw.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-qHq5wV1myiE5SV06uOo7z8GriXJx9IbDTa4zTVC-r7o.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-0TMdjI6-M0ZIX-SGiYxhs_dbFN2Cb7RAHS89T7Ncw8o.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-HFTeP9DnXMctSZXtQXBhbesPdZImy7CtWUU1LSpLU64.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-1yqGSYSkB4RUTHZTipTN1rwVaASyD-oztkCwtNBToOw.jar
    Oct 16, 2020 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-ZG0AeTpdoImBDIU_rqoqoUuifTnU-tcKiKubnCBJt8U.jar
    Oct 16, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 188 files cached, 30 files newly uploaded in 1 seconds
    Oct 16, 2020 6:45:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 16, 2020 6:45:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 16, 2020 6:45:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 16, 2020 6:45:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 16, 2020 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 16, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95594 bytes, hash 766531ae1d35f3865517d39e0af6e9a319fa2f396533efbc14289b23b3d9ded5> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-dmUxrh0184ZVF9OeCvbpoxn6LzllM--8FCibI7PZ3tU.pb
    Oct 16, 2020 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 16, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-15_23_45_14-14838791646431684016?project=apache-beam-testing
    Oct 16, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-15_23_45_14-14838791646431684016
    Oct 16, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-15_23_45_14-14838791646431684016
    Oct 16, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-16T06:45:14.040Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 16, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T06:45:22.392Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 16, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T06:45:23.216Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 16, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T06:45:23.250Z: Expanding GroupByKey operations into optimizable parts.
    Oct 16, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T06:45:23.287Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 16, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T06:45:23.356Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 16, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T06:45:23.393Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 16, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T06:45:23.429Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 16, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T06:45:23.467Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 16, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T06:45:23.942Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 16, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T06:45:24.028Z: Starting 5 workers in us-central1-c...
    Oct 16, 2020 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T06:45:50.866Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 16, 2020 6:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T06:45:56.380Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 16, 2020 6:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T06:46:08.847Z: Workers have started successfully.
    Oct 16, 2020 6:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T06:46:08.885Z: Workers have started successfully.
    Oct 16, 2020 6:46:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T06:46:40.781Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 16, 2020 6:46:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T06:46:40.940Z: Cleaning up.
    Oct 16, 2020 6:46:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T06:46:41.030Z: Stopping worker pool...
    Oct 16, 2020 6:47:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T06:47:28.597Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 16, 2020 6:47:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T06:47:28.663Z: Worker pool stopped.
    Oct 16, 2020 6:47:34 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-15_23_45_14-14838791646431684016 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 2d2914e2-8ccf-40dc-8b1a-a1a8c306d4db and timestamp: 2020-10-16T06:47:34.506000000Z:
                     Metric:                    Value:
                   read_time                    11.423
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 16, 2020 6:47:35 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 34.734 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 20s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/6p7w64hveuaq6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1120

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1120/display/redirect?page=changes>

Changes:

[ningk] [BEAM-11045] Upgraded chromedriver-binary

[Alan Myrvold] [BEAM-11067] Update java license script to include source for GNU and

[zyichi] [BEAM-11066] Fix dataflow runner transform override ordering for

[ningk] Added check for alter_label_if_ipython when input PCollection is an

[noreply] [BEAM-9547] Add support for `DataFrame.assign` (#13102)

[noreply] [BEAM-9547] Flesh out dataframe groupby (and related) implementation.


------------------------------------------
[...truncated 273.13 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 16, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 16, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 16, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 16, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 16, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 16, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 16, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
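
The stack trace above names the fix directly: the Row-typed output of the RowMonitor ParDo has no schema attached, so no coder can be inferred. Below is a minimal, self-contained sketch of that remedy, assuming an illustrative schema and input; the class name, field layout, and sample element are placeholders, not taken from BigQueryIOPushDownIT.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFixSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Hypothetical schema for the four columns the test query selects.
        final Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        // Stand-in for a ParDo (like RowMonitor) that emits Beam Rows.
        PCollection<Row> rows =
            pipeline
                .apply(Create.of("alice,story,Beam SQL,5"))
                .apply(
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void processElement(@Element String line, OutputReceiver<Row> out) {
                            String[] f = line.split(",");
                            out.output(
                                Row.withSchema(schema)
                                    .addValues(f[0], f[1], f[2], Long.parseLong(f[3]))
                                    .build());
                          }
                        }));

        // Attaching the schema lets the SDK pick a RowCoder for the Row output,
        // which is what the IllegalStateException above asks for
        // (an explicit setCoder() would also satisfy it).
        rows.setRowSchema(schema);

        pipeline.run().waitUntilFinish();
      }
    }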

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 16, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 16, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 16, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 16, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 16, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 16, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 16, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 16, 2020 12:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
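
The plan and filter logged above show the push-down path: only the four referenced columns are read and the WHERE clause is handed to BigQuery. The same query shape can be expressed with SqlTransform; the sketch below runs it against an in-memory PCollection rather than the HACKER_NEWS table, and the class name, schema, and sample rows are illustrative assumptions.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class PushDownStyleQuerySketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // In-memory stand-in for the queried table; only the referenced columns are modeled.
        Schema schema =
            Schema.builder()
                .addStringField("by")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> hackerNews =
            pipeline.apply(
                Create.of(
                        Row.withSchema(schema).addValues("alice", "story", "Beam SQL", 5L).build(),
                        Row.withSchema(schema).addValues("bob", "comment", "n/a", 1L).build())
                    .withRowSchema(schema));

        // Same shape of query as in the log, run against the single input PCollection.
        PCollection<Row> result =
            hackerNews.apply(
                SqlTransform.query(
                    "SELECT `by` AS author, `type`, `title`, `score` "
                        + "FROM PCOLLECTION "
                        + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        pipeline.run().waitUntilFinish();
      }
    }

Against the in-memory table nothing is pushed down; projection and filter push-down only happens when the underlying table provider supports it, as BigQueryTable does here with the DIRECT_READ method.
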
    Oct 16, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 218 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-qtTr9bzJycTA_C6tDrqE2xjLETDoSr4TWJY885wQ-UQ.jar
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-1u4NjpHVSIRwXFvwoc90O-ilrjC-OOQdNqhHPrz3kxI.jar
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-lpdRKv4XwlrVpyhc3C2hjW9dIHyHvyfSSSxwgfklRx8.jar
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-5ClI4jLIe3WDNEUvRXBJ7Qk4eDXi7UD4QHX04qwxlpA.jar
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-MAdnItGNHz7k32PPx7sL1ADokE7e4cAZWx1kHn1bfDM.jar
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-qtI3x_iv1B9qoQsLu6Sxi3x4_lgLUTuH4vRMNB0juE8.jar
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-gdY2NJrYeDseCkG0qXzMdx_mGjY8gLSd7SNhO32aRAw.jar
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-jaGvbOTiaKER4MMYRzUP_XpPRMr_x4YPWYIQEaKN9As.jar
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tZZ_AGGkhwBWrVAYytUzolzP-Extul1B_y3LbJdBFlM.jar
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-ubRGMseLNRBu1mqI_iYLrNkB9YvfDWLhVQVXlty1AVg.jar
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-PFtvpsJsEvmas8X1reKkRsEyz6SNej82fOn0JBXnhT0.jar
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-ttqgY83JIQkjg3hrMMZbP41tiAqE7SWyx7iz7HfpgIA.jar
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-ZsbBroGU7bgRT7CiwIoUnxSiNfSQFo5iDOYa3uCmaTY.jar
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-WTQMnNR-fRSkACqC9oUHEooCmove5T8yJ5kZdgdCypU.jar
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-LLiUUAoGoHCrZnIwi5iFID1z7hBTseos1tHSsi5KrgI.jar
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7796703049938051395.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-yyP8wLP0giEwE3HkAGDrTLCkuMg9iYV0KO1cKphTRag.jar
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-NMkCpDeNGmyZ0_NOQHISniXr5knu5nMNw3dOgSGW8tU.jar
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-b8gd26yop9wwy6rj2Dwmc9UaLNWXGd1Fl-Ct5mxJAMY.jar
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-53dq-ws5wKJJ3hbeZOIuk509qUheYBf_drCh8vO0tFs.jar
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-0BvYRTspViYiflfV6Gk0I6FCESVNDYmWet2CGWxdRNw.jar
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-N3z93cocLJ2LsLyOb_cf5qdCmImEYFnOrkjOzMkBuLk.jar
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-qtTr9bzJycTA_C6tDrqE2xjLETDoSr4TWJY885wQ-UQ.jar
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-b1iFj8w62WUbDHMQve-UlX4PTnDZjwZz5q0M1r_alk8.jar
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-a2l6T9JWQq7cKa7Nle26CN4JLTElCln7DCMGTKvTlfE.jar
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-kZ1jVO5Zn-qBLZdQZPf5W1Wj29P7_4qZu3oKOFZzQus.jar
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-BKc-ncSYMs_2IKUohcvmrB5K0eRSMoj5vLgSFglAO3U.jar
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-7gxlj7Bfjvp6UUVJnQHD_nl7dpEwQSpKBF-VUkJv188.jar
    Oct 16, 2020 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-8-wyJ5NPyTmz7KXTv4HKevANGYhu2FZUZ1f_3EE2O_s.jar
    Oct 16, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-MZgzFOrwSh_j6oUTrzxsDcS47eu4mo_KrEDCKLiaxXg.jar
    Oct 16, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-o-dGyz2mneWo4xyESEATVGx3MmeslFvCoifKCO77LEY.jar
    Oct 16, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-UlxLrTYMLOr5PqBYcrb1Kn4ohG4DJii4zGreyhMAqhI.jar
    Oct 16, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 188 files cached, 30 files newly uploaded in 1 seconds
    Oct 16, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 16, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 16, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 16, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 16, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 16, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95594 bytes, hash 0505ebc2735298675c679cfbf7e13e7349c3ce31ef3ba717b52f44bd1f8788ec> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-BQXrwnNSmGdcZ5z79-E-c0nDzjHvO6cXtS9EvR-HiOw.pb
    Oct 16, 2020 12:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 16, 2020 12:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-15_17_45_38-11188676983290405470?project=apache-beam-testing
    Oct 16, 2020 12:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-15_17_45_38-11188676983290405470
    Oct 16, 2020 12:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-15_17_45_38-11188676983290405470
    Oct 16, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-16T00:45:38.188Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 16, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T00:45:47.722Z: Worker configuration: n1-standard-1 in us-central1-a.
    Oct 16, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T00:45:48.531Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 16, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T00:45:48.576Z: Expanding GroupByKey operations into optimizable parts.
    Oct 16, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T00:45:48.655Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 16, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T00:45:48.958Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 16, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T00:45:49.055Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 16, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T00:45:49.114Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 16, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T00:45:49.142Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 16, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T00:45:49.571Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 16, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T00:45:49.662Z: Starting 5 workers in us-central1-a...
    Oct 16, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T00:46:12.045Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Oct 16, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T00:46:12.074Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Oct 16, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T00:46:15.100Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 16, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T00:46:17.336Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 16, 2020 12:46:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T00:46:35.454Z: Workers have started successfully.
    Oct 16, 2020 12:46:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T00:46:35.499Z: Workers have started successfully.
    Oct 16, 2020 12:47:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T00:47:08.615Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 16, 2020 12:47:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T00:47:08.823Z: Cleaning up.
    Oct 16, 2020 12:47:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T00:47:08.913Z: Stopping worker pool...
    Oct 16, 2020 12:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T00:47:50.070Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 16, 2020 12:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-16T00:47:50.126Z: Worker pool stopped.
    Oct 16, 2020 12:47:56 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-15_17_45_38-11188676983290405470 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 7ffb2d18-3540-438c-a3b1-70d18a057dc0 and timestamp: 2020-10-16T00:47:56.623000000Z:
                     Metric:                    Value:
                   read_time                    13.007
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 16, 2020 12:47:57 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 34.403 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 40s
107 actionable tasks: 70 executed, 37 from cache

Publishing build scan...
https://gradle.com/s/ujdvhb57clv6w

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1119

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1119/display/redirect?page=changes>

Changes:

[Chad Dombrova] Update mypy to 0.782

[Chad Dombrova] [BEAM-7746] Add type checking to transforms

[noreply] Fix typo in twister2 docs

[Chad Dombrova] fixes

[piotr.szuberski] [BEAM-5551 BEAM-5595 BEAM-6090 BEAM-6091 BEAM-6092 BEAM-6093] Update

[noreply] [BEAM-10587] Support Maps in BigQuery (#12389)

[noreply] [BEAM-9547] Allow wrapping multiple return values. (#13104)

[noreply] [BEAM-11046] Update Snowflake docs about supporting AWS/S3 (#13057)


------------------------------------------
[...truncated 286.27 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 15, 2020 6:47:42 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 15, 2020 6:47:42 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 15, 2020 6:47:42 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 15, 2020 6:47:42 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 15, 2020 6:47:42 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 15, 2020 6:47:42 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 15, 2020 6:47:42 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 15, 2020 6:47:43 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 15, 2020 6:47:43 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 15, 2020 6:47:43 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 15, 2020 6:47:43 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 15, 2020 6:47:43 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 15, 2020 6:47:43 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 15, 2020 6:47:43 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 15, 2020 6:47:43 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 15, 2020 6:47:45 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 218 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-ibXCDQ-QA7q-LqzB_JgtBmQ2fRRCCrhZ9UKOrk7URXc.jar
    Oct 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5539346785516107782.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Xe_4CZZPDzI4oY65WBdvikpNHULr8gJLhKON_9GnDD0.jar
    Oct 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-ibXCDQ-QA7q-LqzB_JgtBmQ2fRRCCrhZ9UKOrk7URXc.jar
    Oct 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-b3kxiUWlVBZDMnJpap6Jywa3k2WcIpOXcAGwyrtRcyc.jar
    Oct 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-h2xx2dPcAHC8FlnQ7TlFPaeQWRPiZacD9YCFcvlN_f8.jar
    Oct 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-WxlKm8elAXQ8v-1buwWznmbG6f5N4gaZuhUSHls3QgU.jar
    Oct 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-9eyabIde7wdtOHL-Sak6WRn9zHjA8YmxNqsEwMLaU0o.jar
    Oct 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-X7SlR_Y_83YtdM14bXNzOXQxYKsEc9qiXHN6YL6X32A.jar
    Oct 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-DDH8HcnTkW6RkV3aCsAvaebL2hZR0UyIvAZVjoecncw.jar
    Oct 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-K9Nx8RYFow8n8DGUITNypwuTWtaLO0vZkJVVNr4UcM4.jar
    Oct 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-w3BqaNH2BzpFeLEnzqyOCt_YFeKHPI689uNhc1kLM5I.jar
    Oct 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-bunT0khIywK2QqSmCxDGYk30yvB25fubwlt4iAtf0Ok.jar
    Oct 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-rT5UWckxjzkQi3h_G6zc20GDwao5hVgqsLuqwdB1pYg.jar
    Oct 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-3pK_NXK3Y4D4j7KidUS72Y-caqllW1QdiL_thveorio.jar
    Oct 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-uANoUzxHR4uRJBd6qKqiEiGEtsb3dHO0fHbdvMWSCOQ.jar
    Oct 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-VenLzdqpHWuWqu6eSWShxny4u5_CiQssLEOhJSA4gzI.jar
    Oct 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-nb-27QqOpspyHHG3bb1x6PPtl1Z4eOySjnMTbZIpAOA.jar
    Oct 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-TZ5ZpqGiQoGk4_YeNvJNyG7lmlM2kQGqd2r9yU8333M.jar
    Oct 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT--FVs_94KKSJeSgshJbD2i-oRViZf76Xzwz8d42MphfY.jar
    Oct 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-fo8QkVrgoVGpzryBksyLDbrSNU8ZA2gVCtoT1xL2M-s.jar
    Oct 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-SxmghvJ7Fwh5XOu70Nyyt0imDunwTRV83hxetWj7nMI.jar
    Oct 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-XDlNd1IL1pTgTJnEIZ2fmZ1L_wXi8tJIDIPX5oV0r6E.jar
    Oct 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-6Oz0LIJ6-RTy90PpYRp7O9PYdud1YmZXgeOn5A7NpXs.jar
    Oct 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT--oOV-tLa8duAUAuSYOT11I4YgLTjqspWaz0jmlyzHi4.jar
    Oct 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-TqzPP6gfAN9diO8avG_j9Pc9e22bZ3QEvc6jfnvAH3I.jar
    Oct 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-yoTiciLhtyqxo-IVjgWgwFydgYtyYZDFIA6vJocEHmo.jar
    Oct 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-3VGsuuf07GH2LKxHrwzQo8_nIHAta64VQkeRDSrk8BI.jar
    Oct 15, 2020 6:47:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-8r0Q8qzXnmhz6mrlNRYo6cXODvO0hogyr6yBFWpWtPQ.jar
    Oct 15, 2020 6:47:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-QcdrwGc4L4MOhZhC2GcQ4wBWj8X1woVAu0RX9T2IuTc.jar
    Oct 15, 2020 6:47:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-rLTjaugP3bNFoggsqF0diocYZWtjP46g9UMqN7EORek.jar
    Oct 15, 2020 6:47:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-2LIZTvxvw7r-TZDaOOCX_-HN5wFuUNSN1MQhSHZNcLQ.jar
    Oct 15, 2020 6:47:49 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 188 files cached, 30 files newly uploaded in 1 seconds
    Oct 15, 2020 6:47:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 15, 2020 6:47:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 15, 2020 6:47:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 15, 2020 6:47:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 15, 2020 6:47:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 15, 2020 6:47:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95588 bytes, hash 844f01f62e721dd745ba918159d3da5fc507f1407d0d5c6a0b5b190d4341d6ef> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-hE8B9i5yHddFupGBWdPaX8UH8UB9DVxqC1sZDUNB1u8.pb
    Oct 15, 2020 6:47:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 15, 2020 6:47:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-15_11_47_50-5100809927032730109?project=apache-beam-testing
    Oct 15, 2020 6:47:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-15_11_47_50-5100809927032730109
    Oct 15, 2020 6:47:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-15_11_47_50-5100809927032730109
    Oct 15, 2020 6:47:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-15T18:47:50.369Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 15, 2020 6:48:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T18:47:59.153Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 15, 2020 6:48:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T18:48:00.703Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 15, 2020 6:48:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T18:48:00.746Z: Expanding GroupByKey operations into optimizable parts.
    Oct 15, 2020 6:48:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T18:48:00.776Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 15, 2020 6:48:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T18:48:00.854Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 15, 2020 6:48:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T18:48:00.885Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 15, 2020 6:48:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T18:48:00.917Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 15, 2020 6:48:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T18:48:00.969Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 15, 2020 6:48:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T18:48:01.499Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 15, 2020 6:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T18:48:01.572Z: Starting 5 workers in us-central1-c...
    Oct 15, 2020 6:48:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T18:48:24.865Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 15, 2020 6:48:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T18:48:34.026Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 15, 2020 6:48:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T18:48:56.437Z: Workers have started successfully.
    Oct 15, 2020 6:48:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T18:48:56.464Z: Workers have started successfully.
    Oct 15, 2020 6:49:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T18:49:33.440Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 15, 2020 6:49:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T18:49:33.619Z: Cleaning up.
    Oct 15, 2020 6:49:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T18:49:33.713Z: Stopping worker pool...
    Oct 15, 2020 6:50:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T18:50:28.153Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 15, 2020 6:50:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T18:50:28.200Z: Worker pool stopped.
    Oct 15, 2020 6:50:34 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-15_11_47_50-5100809927032730109 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ee7f85ef-7a72-46d3-aa78-3898f225ffda and timestamp: 2020-10-15T18:50:34.889000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    16.418

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 15, 2020 6:50:35 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 4 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 3 mins 3.188 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 1s
107 actionable tasks: 76 executed, 31 from cache

Publishing build scan...
https://gradle.com/s/5lhsqick7uvkk

Stopped 3 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1118

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1118/display/redirect>

Changes:


------------------------------------------
[...truncated 270.86 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 15, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 15, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 15, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 15, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 15, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 15, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 15, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
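
    The exception's suggested remedy, attaching a schema with PCollection.setRowSchema so the SDK can pick a coder for the Row-typed ParDo output, looks roughly like the sketch below. The input PCollection, schema fields, and pass-through DoFn are assumptions for illustration only, not the integration test's code or a fix for this particular failure.

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaSketch {
      // Assumed schema matching the projected columns of the query above.
      static final Schema SCHEMA =
          Schema.builder()
              .addNullableField("author", Schema.FieldType.STRING)
              .addNullableField("type", Schema.FieldType.STRING)
              .addNullableField("title", Schema.FieldType.STRING)
              .addNullableField("score", Schema.FieldType.INT64)
              .build();

      static PCollection<Row> monitor(PCollection<Row> rows) {
        return rows
            .apply("RowMonitor", ParDo.of(new DoFn<Row, Row>() {
              @ProcessElement
              public void processElement(@Element Row row, OutputReceiver<Row> out) {
                out.output(row); // pass-through stand-in for the test's monitoring DoFn
              }
            }))
            // Row has no default coder; setRowSchema supplies one, as the error message suggests.
            .setRowSchema(SCHEMA);
      }
    }
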

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 15, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 15, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 15, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 15, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 15, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 15, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 15, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 15, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 15, 2020 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 15, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 218 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-ay-UhAkyWpQKOX4icFNx_bhgmlqXq_b5p4hnrrE59Rg.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-bRkgRpK9Y8soegnzkaOCFvHhRjgQfRURsIUdlW2Pt-8.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-icGkS15JNoHdJaFZOMQSlIfsy5PNprZXOQZIo83aNq4.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-egVnhjjKfP8NZuBoHqJQoIROduYg55CDxMBxbYxknXU.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-gkJGueGLfizREtSe3115omsfMfG3nqGlJD9Kkz3xQSc.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-g4N9Hmycyf16cS6MzjUCzmG14nB0HsDKY_w2xBY7gbU.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-pFUXa1JFTdBcn4CtFH2MCLpJGvkFIF6zWcHV4GIW7yU.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-PbpdK3EmJ78k6R0hCQ0iqNPmN6ekQr1MkpKKJtonosM.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-eBGyJljScyWzPqNtNN-MLVnoiV49-GXqUsZd57oNf5Y.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-DHKcvfEQjH093tfzDyMQlKT9E0CiwGvkeZ8amlQ4fWU.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-uXWg_7nmEX-QLjTeca2kv4HAfHVDBCEyCcNPsTV3Qm8.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-YbZv0aj0dMyrbHK5Vw-61iVpjX1N9hS0s_MqMmhQTys.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1195362980615037559.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-pggAeVCccc5coMtMF-uJnRPJkpF_AjrcDx_UYssLF3U.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-hl235kP6sYuNpy1EacWqqZq1MI9nDt3DgYPlQErr2oE.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-_sKdItea3R1uNpivicTN4VXl4uIVHPGDVpJ5b4nSA6Q.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-ay-UhAkyWpQKOX4icFNx_bhgmlqXq_b5p4hnrrE59Rg.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-KLCKBbhgIUODX55qlHDC_zFZchZ0VqP09l79_vb86Ok.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-xGYPuO53QRSOEs-KWNqgg3IleI94cWVLmeXTCKsY3fk.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-feDfYYBeJa4j18WXqoUFrJdzEO7TfPBLwE9-TvXn2Yg.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-v5kUy2nhjdDEbzJRA2rct-yu5I6TzxM_Bhb_Lntnh7E.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-8AcNXVcvCY86_7AsiNBlCoLdvafIa0KMFA4pmzA5vQw.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-j7gJJqL_JZx00mW7aYTz-Q1yJCVxSPBD394RtECkZmg.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-0hP1lrwmXvvIBZPcchn84fbanZakhv9fEHgabYxrrb8.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-ieJrLuH4KhyytrK0cvy7r_-RSBlZtUsWFrUoxBDvF94.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-HR573OLr97FafAHupOLZ8_QzDVpCZplGKFP3MKEnk-M.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-97UiEG2vtN1xkK0_gxDszo-EaCkPtkhxCX7WBzN_t50.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-_NmEnJ2Xe7kMSLmSbrVlu1_OFyfE-G0zvFYo1cq574s.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-sRxapo2EDAGPSSkSkC9oWXW7NvTaXFEj_teP5qW_SaE.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-ifH_m0AA2TM0HIjUuV9XHW2bibUGgx2UMNnzPQDdUQY.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-7DqiiKizNPmXiKUI7pfH5-fm9d3Kpc2eOKG6PFOddF4.jar
    Oct 15, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-jldZLsmOTYY1BWAsm71caM16dRALObuppVngavk6Fds.jar
    Oct 15, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 188 files cached, 30 files newly uploaded in 1 seconds
    Oct 15, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 15, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 15, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 15, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 15, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 15, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95594 bytes, hash 187db7768f66db89462b5b952667b9a0b0be7d606e2a7bf0898ec990c0993047> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-GH23do9m24lGK1uVJme5oLC-fWBuKnvwiY7JkMCZMEc.pb
    Oct 15, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 15, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-15_05_45_17-6752516875800643636?project=apache-beam-testing
    Oct 15, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-15_05_45_17-6752516875800643636
    Oct 15, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-15_05_45_17-6752516875800643636
    Oct 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-15T12:45:17.203Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 15, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T12:45:29.405Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 15, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T12:45:30.874Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 15, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T12:45:31.109Z: Expanding GroupByKey operations into optimizable parts.
    Oct 15, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T12:45:31.209Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 15, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T12:45:31.305Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 15, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T12:45:31.337Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 15, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T12:45:31.372Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 15, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T12:45:31.415Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 15, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T12:45:31.826Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 15, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T12:45:32.296Z: Starting 5 workers in us-central1-c...
    Oct 15, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T12:45:37.169Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 15, 2020 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T12:45:57.782Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 15, 2020 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T12:46:13.702Z: Workers have started successfully.
    Oct 15, 2020 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T12:46:13.901Z: Workers have started successfully.
    Oct 15, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T12:46:52.606Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 15, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T12:46:53.594Z: Cleaning up.
    Oct 15, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T12:46:53.889Z: Stopping worker pool...
    Oct 15, 2020 12:47:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T12:47:37.970Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 15, 2020 12:47:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T12:47:38.183Z: Worker pool stopped.
    Oct 15, 2020 12:47:46 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-15_05_45_17-6752516875800643636 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 9cd106c9-3850-45db-8e0e-b7065364556e and timestamp: 2020-10-15T12:47:46.715000000Z:
                     Metric:                    Value:
                   read_time                    18.749
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 15, 2020 12:47:47 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 44.063 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 32s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/htjyu4cnwkgwc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1117

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1117/display/redirect?page=changes>

Changes:

[Kenneth Knowles] Disable unsupported categories for Dataflow streaming

[noreply] [BEAM-10967] adding validate runner for Dataflow runner v2 to Java SDK

[noreply] [BEAM-10959] Fix circle buffer. (#13123)


------------------------------------------
[...truncated 271.04 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 15, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 15, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 15, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 15, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 15, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 15, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 15, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
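
The exception text above points at the remedy: a PCollection of Beam Rows needs a schema (or an explicit RowCoder) attached before pipeline construction can finish. A minimal, self-contained sketch of that fix in user code -- the class name, field names and values below are illustrative and are not taken from BigQueryIOPushDownIT:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt32Field("score")
            .build();

        // A ParDo emitting Row values; without a schema or coder on its output,
        // construction fails with the same IllegalStateException as above.
        PCollection<Row> rows = p
            .apply(Create.of("seed"))
            .apply(ParDo.of(new DoFn<String, Row>() {
              @ProcessElement
              public void processElement(ProcessContext c) {
                c.output(Row.withSchema(schema)
                    .addValues("someone", "story", "a title", 3)
                    .build());
              }
            }));

        // Either line resolves the "Unable to return a default Coder" error:
        rows.setRowSchema(schema);               // preferred for Row PCollections
        // rows.setCoder(RowCoder.of(schema));   // equivalent explicit coder

        p.run().waitUntilFinish();
      }
    }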

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 15, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 15, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 15, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 15, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 15, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 15, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 15, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 15, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
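
The plan above shows what push-down buys: the source reads only the four used fields and the supported filter is handed to the BigQuery storage read, leaving BeamCalcRel as a plain projection. For reference, a query of this shape is issued from user code through SqlTransform; a minimal sketch, assuming the HACKER_NEWS table has already been registered with the SQL environment via a BigQuery table provider (table registration, and the choice of DIRECT_READ versus DEFAULT read method, are part of that setup and are not shown here):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class PushDownQueryExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // HACKER_NEWS must be resolvable by the planner, e.g. through a
        // registered BigQuery table provider; otherwise this apply fails
        // during pipeline construction.
        PCollection<Row> result = p.apply(
            SqlTransform.query(
                "SELECT `by` AS author, `type`, `title`, `score` "
                    + "FROM HACKER_NEWS "
                    + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        p.run().waitUntilFinish();
      }
    }

The two plans logged in this build illustrate the difference: with DIRECT_READ the planner rewrites the IO node into the BeamPushDownIOSourceRel seen above, while with DEFAULT the plan keeps a plain BeamIOSourceRel and the filter is applied only after the full table has been read.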
    Oct 15, 2020 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 218 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-Xjm0dwoJjPGDgQazYbS5_WNbzorx2D-4Aj0zDRnYjdk.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-Rp1YMPvSUjWcyobeLhbdlYI8XrNdmViqChRgFC7lmKI.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-HEh6iIf-ffcbjGhUv_BEi6l7nIayhffsFTj9ISlCovE.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1159135383349019613.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-mclDSkHFEVWtA_c4wgZGT6drWZcNh3-TIsom-42YsCg.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-RhN1WWgxzzfpDNoAS82_fIDMWg0dsJMJLoBSL6g4cS8.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-KcI1DgSkILY--YzduACyGB2XdwzJl_pS_ly3FCjMjKM.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-Lt6pPibIDiT7SZ_XPkxXMthf6mBssOUJBXVOSdSE52w.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-9G4-3jGFxdnnxAN178221OYOwAsVzWU-NLFgMGxp-Ok.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-Vx1p3G71mCqcgVa4xU_9UuBjneTcA5lF_3zduxi3EHo.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-3GDU8YrnlQC4tlShNezsBvWwZaLNDZVSK7hPGwF5E7I.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-FJ6HxeRI6uINmA6TJcPWrvDIXjjGVxINKBOvxForllY.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-gIlNqTeHO4zze68FVCUH_eFUpYXKbpc_Jc9nMAyfSM4.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-uPYob8VCKcUVoSjUVP15y5DRX6JF8ja9WcgyNPG2sGg.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-t7IsdbKmLerfpj4KH767YiHlbQ1HpX6LYuLUmDGrtss.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-4auA8_aP-0QJVp1fQh_9jx8Cw1CjsorNTbRG2np18VE.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-D11Zrm-He5XgTwv1tltvRi6iDtMbCO87FXKtp_upiXY.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-qINtVTy62bdeei9XBXMlYsbTbpwfZWwD0_ZXd5LsQEE.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-etLx4zFK9lztYpHJn5U6F1tgJujD8qWn3hCysoj8B58.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-bWTtHSEJdhJGq5PN2VRggwn4zmLEzO93RxqDtWZE_qQ.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-Xjm0dwoJjPGDgQazYbS5_WNbzorx2D-4Aj0zDRnYjdk.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-f3e1r-74ljFUnKdOxTKnWuvgPLktRXJRjppXmAD5JMA.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-d-9_wUvle3Z3B_xLWnLFD-Z9vLp9kgcfxfC5aI8OuMs.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-6HS8FdfcG4QD98RTHCpCKHTG8CK8r5pWTKaujUbEQpA.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-EAZXIqi3daGhPDkQ6kbdHUgAtDuu_V-mGq1S-nAGFQs.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-FNrTxWNiIR8S6rSIKRkbRY8Bq73_hh5gUGPGYYiGB1I.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-I08tmW6NH7VaayuYIiIWCbRU2csAM_C8KlFtc3mDr0k.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-sKnVeTMh39WFdjsRh9wCIbK02pMVWWWtKPKFuifFmg8.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-WGPnvDwUT1O4_BCQPval5t2w1A0nQcs2keBATcPy5M8.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-ykdpv3j4eSc-DpgLMpYtrtNTAzNCUtIPWLWNNSnDj0Q.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-S9dTObCqx5p6PLFt5-ibTo0xQoT3lSoi4TjLaoRmsjA.jar
    Oct 15, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-5ymroOQJ7w9QtTpWIUaXF-RO0ifigt2Iduz3bg3T6Js.jar
    Oct 15, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 188 files cached, 30 files newly uploaded in 0 seconds
    Oct 15, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 15, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 15, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 15, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 15, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 15, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95594 bytes, hash 4c02e43dc7a99c2600cf25ac50b15ef69749b4d3b6f0537385d259a0a790d935> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-TALkPcepnCYAzyWsULFe9pdJtNO28FNzhdJZoKeQ2TU.pb
    Oct 15, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 15, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-14_23_45_16-5963141741332218227?project=apache-beam-testing
    Oct 15, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-14_23_45_16-5963141741332218227
    Oct 15, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-14_23_45_16-5963141741332218227
    Oct 15, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-15T06:45:16.992Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 15, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T06:45:25.066Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 15, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T06:45:26.346Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 15, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T06:45:26.377Z: Expanding GroupByKey operations into optimizable parts.
    Oct 15, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T06:45:26.408Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 15, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T06:45:26.481Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 15, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T06:45:26.521Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 15, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T06:45:26.561Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 15, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T06:45:26.595Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 15, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T06:45:26.951Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 15, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T06:45:27.028Z: Starting 5 workers in us-central1-c...
    Oct 15, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T06:45:33.818Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 15, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T06:45:55.046Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 15, 2020 6:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T06:46:12.094Z: Workers have started successfully.
    Oct 15, 2020 6:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T06:46:12.139Z: Workers have started successfully.
    Oct 15, 2020 6:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T06:46:44.945Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 15, 2020 6:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T06:46:45.184Z: Cleaning up.
    Oct 15, 2020 6:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T06:46:45.259Z: Stopping worker pool...
    Oct 15, 2020 6:47:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T06:47:33.887Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 15, 2020 6:47:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T06:47:33.935Z: Worker pool stopped.
    Oct 15, 2020 6:47:42 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-14_23_45_16-5963141741332218227 finished with status DONE.
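
Once a run reaches a terminal state like this, its status and summary can also be checked outside the Jenkins console; a sketch, assuming an authenticated gcloud SDK with access to the apache-beam-testing project:

    gcloud dataflow jobs describe 2020-10-14_23_45_16-5963141741332218227 \
        --region=us-central1 --project=apache-beam-testing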

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c65a0c98-05a9-47ed-80c1-2999a1895993 and timestamp: 2020-10-15T06:47:42.394000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.618

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 15, 2020 6:47:42 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 39.581 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings
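
To reproduce the failing task locally with the extra diagnostics suggested above, the usual invocation is through the Gradle wrapper at the repository root (a sketch, assuming a checked-out Beam workspace):

    ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --stacktrace --warning-mode all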

BUILD FAILED in 3m 26s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/s6c32nuzosxvq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1116

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1116/display/redirect>

Changes:


------------------------------------------
[...truncated 272.64 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 15, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 15, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 15, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 15, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 15, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 15, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 15, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 15, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 15, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 15, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 15, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 15, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 15, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 15, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 15, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 15, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 15, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 218 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 15, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-cGyGCvFwC-5NXxoi3yV12zlWcU-GJMDQnFXLEFJeaoM.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-i8fUL_NFhY4GLgdXi4efY_DxyRrg6R0fH2bX847cxf0.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test9063944422703269742.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-OOpBdLei40BeXJ7V1FMaAfsUbzE_ZXStmaZ-4z6nLrI.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-Tu955RRCzNMur90rhAS9eoA7BDcPpkVX8iEK2eypoVk.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-lvRDUtMiaJZnPVYTCHesWZVAPsnvlZeRmFrKiiIGtM0.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-vkivmHBD8ZQ7lgBxVIJaMEIl3pyNC8aeTYYsrneejDk.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-UmcoOkgIOah91HjIkU9nebG-Eug6VpDJIrzKmZdThUU.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-phxyAgDJcji5wHeG40tMdnZlsRhsnHEfHTolEfV8HvA.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-LKiQ_5X18leE35AhS23d9tlD3DI0Enw0AbMfYn7idEI.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-24nvuxUW0Udwb6Kg7V7PwFn6jj6r8wasEEHQnVifhbc.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-n2Cu-jJkgKB1TJLx8-1OrA_QVLsJSWsTGmhImQjPJxU.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-9z1Kr1NTdlT1DC3Dg65vCL4qCYxQOdlnMz8XMU61nYA.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-0a0bHPqbjKYkHteMsanYIWW0PdaN-hQrtt-RZd6OrBw.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-hMFzUOSB4UDpAbMgrg_oC7D99VAi8jj_9GZnD8kMBN0.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-0WHKhtLH06dDuK5JDvnWdgU1ASvEFHx-iqA390ZJZ00.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-cGyGCvFwC-5NXxoi3yV12zlWcU-GJMDQnFXLEFJeaoM.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-CwG2YTL6Htt6JgNpWup6IwsoUq-V7D8WMiZbGhmHyjE.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-vQ9uKFGq2pO_eYPcRYfzOtld7lPr9HA9xVBseNCa1tE.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-UO-EuIfPiAJBVIBjYayttoQ5wV_BmABfmKmkW7MkZzU.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-Kb2Klh5srO9N_VcPaah1Q7uQbG6UIyN-33s2obr3_jc.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT--M_SVA6dJVonFFuQOAM-i46jjBPzGWrrwFg8IHXUqEE.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-xOGvlBz_vNvFV0b2DPa4oU_R7duq2KOldKnw-61xonw.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-adsJKV8Xw_I37PAGlC48o-y1R56jOwqa1kDIdFk-X-c.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-zLomNO69I6oJZJpro6HEFxsCkmiXp70ArqPJDPHmVHg.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-3aAqzzk8PHHJvZZ_J400TIsA0HpfXbrgsMO2oB83Lvg.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-P89Nss0sGMgNsSzUDcK_Sr9PmUkTyd5F5qSG7E0JO_s.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-7uCll7aze1REYT1X1T90kuZGDAaaNqUIVYpH8pecnOE.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-aI_vQo4uciDeEkPaiNcUkE1OuflFddb8ywGZ1dGuqhQ.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-_W7ROC9Em01iitSKzR13YK0K3WeDN2jwFCZ45Ak_ioc.jar
    Oct 15, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-xR9sIotonpEbqK7zDWNSrms0oYTS2LZo6YtozHj3lfk.jar
    Oct 15, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-kN3fS0SIuIiM5OaaU4vd9eWyemDYOYSK3UKGXdc0ViQ.jar
    Oct 15, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 188 files cached, 30 files newly uploaded in 1 seconds
    Oct 15, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 15, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 15, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 15, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 15, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 15, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95594 bytes, hash 1d6a03b0bd310e862ff1e39175f2d8457a25bb43b525d179689348233bfc512d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-HWoDsL0xDoYv8eORdfLYRXolu0O1JdF5aJNIIzv8US0.pb
    Oct 15, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 15, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-14_17_45_42-3886320412886917326?project=apache-beam-testing
    Oct 15, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-14_17_45_42-3886320412886917326
    Oct 15, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-14_17_45_42-3886320412886917326
    Oct 15, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-15T00:45:42.258Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 15, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T00:45:51.213Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 15, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T00:45:51.899Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 15, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T00:45:51.936Z: Expanding GroupByKey operations into optimizable parts.
    Oct 15, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T00:45:51.979Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 15, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T00:45:52.054Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 15, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T00:45:52.095Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 15, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T00:45:52.131Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 15, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T00:45:52.165Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 15, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T00:45:52.494Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 15, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T00:45:52.550Z: Starting 5 workers in us-central1-c...
    Oct 15, 2020 12:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T00:46:17.553Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Oct 15, 2020 12:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T00:46:17.586Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Oct 15, 2020 12:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T00:46:22.921Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 15, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T00:46:24.050Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 15, 2020 12:46:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T00:46:40.100Z: Workers have started successfully.
    Oct 15, 2020 12:46:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T00:46:40.126Z: Workers have started successfully.
    Oct 15, 2020 12:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T00:47:16.406Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 15, 2020 12:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T00:47:16.529Z: Cleaning up.
    Oct 15, 2020 12:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T00:47:16.606Z: Stopping worker pool...
    Oct 15, 2020 12:48:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T00:48:00.148Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 15, 2020 12:48:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-15T00:48:00.193Z: Worker pool stopped.
    Oct 15, 2020 12:48:08 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-14_17_45_42-3886320412886917326 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 31696c6e-209f-4995-bddc-12a2dc4be30e and timestamp: 2020-10-15T00:48:08.400000000Z:
                     Metric:                    Value:
                   read_time                    14.896
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 15, 2020 12:48:08 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 43.922 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 45s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/3xurznlye6qau

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1115

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1115/display/redirect?page=changes>

Changes:

[pawel.pasterz] [BEAM-8637] Update http client

[pawel.pasterz] [BEAM-8721] Update stax2-api


------------------------------------------
[...truncated 278.78 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 14, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 14, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 14, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 14, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 14, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 14, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 14, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
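
The IllegalStateException above is the standard Beam error for a PCollection<Row> that has no schema attached: the SDK cannot infer a coder for Row, so the producing transform must declare one. A minimal sketch of the two remedies the message itself names follows; it assumes a PCollection<Row> called rows obtained earlier in pipeline-construction code, and the field names/types are illustrative, not the test's actual HACKER_NEWS schema.

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;

    Schema schema = Schema.builder()
        .addStringField("author")
        .addStringField("type")
        .addStringField("title")
        .addInt64Field("score")
        .build();

    // Option 1: attach the schema directly, as the error message suggests.
    rows.setRowSchema(schema);

    // Option 2: equivalently for Row elements, set an explicit RowCoder.
    rows.setCoder(RowCoder.of(schema));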

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 14, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 14, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 14, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 14, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 14, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 14, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 14, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 14, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
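
With DIRECT_READ the planner swaps BeamIOSourceRel for BeamPushDownIOSourceRel: only the four used fields are requested and the supported predicate is handed to the BigQuery Storage Read API as a row restriction. Roughly the same effect can be expressed directly against BigQueryIO, as in the sketch below; the table name is an assumption and this is not the code the test itself runs.

    import java.util.Arrays;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.values.PCollection;
    import com.google.api.services.bigquery.model.TableRow;

    // Read only the projected columns and push the filter to the
    // Storage Read API instead of scanning the full table.
    PCollection<TableRow> rows = pipeline.apply(
        BigQueryIO.readTableRows()
            .from("bigquery-public-data:hacker_news.full")        // assumed table
            .withMethod(TypedRead.Method.DIRECT_READ)             // Storage Read API
            .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
            .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
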
    Oct 14, 2020 6:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 218 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-F0aCKzFYpjiBVkroys5v5FpKI66ovEa73onxlFPTu2o.jar
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-XZ1TbwLgLdViO_7p4ryJ37ZO1wd8BJrFQolikX_rcZU.jar
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-_xbdrnJstI0IRQD4kzUfqABppM_qYOXxqHyo5CqQg4A.jar
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-KXNsADbNhNyRlu6yz_b4Puxmcmf-SCzg4mes-GDqW50.jar
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-zKL40hdBaXsJ9jL-z8ZO9_f29BoybG-5pToVIW3wiN0.jar
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-GL2bjVyjt2bIyE7tTXOPL-o7-Q0hh8q3zsNas64mUd4.jar
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-iQNbHOBg_G2INLDViTtDL2lSIWxnlEVoVMY-tyeBvwY.jar
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-33oURGeeaQFGlpnd9YC0mGGy1j9-Ak6NbFrKTyvwFsY.jar
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-8h-LTkpDHmS2mzPxd9F9rj9C_PZgsEYmB8GxCouHAr8.jar
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-9jvOWWnbzdOtjuZYToFgJwY8670YlV0m8JH7u0fEBrA.jar
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-LBRKyd4C13tALhKrBtf_DoYZ6n-6QlHYex1oC640Scs.jar
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-fq4J-4h3QaN7QiCAnaMERkQixt0RDK5b9Ma6UnFl-CQ.jar
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-F0aCKzFYpjiBVkroys5v5FpKI66ovEa73onxlFPTu2o.jar
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-u3lL-8BDmXPpwjbwMBdM8KYkJPKLp83u2f6IQLsh7tA.jar
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test109602746437345310.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-tieHOVcSHIduGDvsCiwBnINSP_n2I1_B9T6LLCsNhI0.jar
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-Bztx5h1lNzMOVGOE7uouFw56BlQZ0-WjSupAyO_qw0Y.jar
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-sKZ8AFT7Sc9lvKIzl5OFExm94sfeDk373LOjel3CqX4.jar
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-XMCWK2l18HOp_dAUWuQ4yPnimg2zYFN9n1O4ByBXmiw.jar
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-2GIC3XjWhxgXmF_HEi8LgG2jzR1g_Jr9ICYM9FRKYhs.jar
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-J39YbGQqqO7CrkLLFLtDFs_3oIY-9KGuJ5_gG5igK4o.jar
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-AskDJ1I7f9dDfiLhLpKN7_r8xSAZJD2hKJc4Vehyx6c.jar
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-UPnLb7tXvBmaWxu_4rxOsXO-MKGXBVVKywyj_5V4s8A.jar
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-3-A8b7hfWHRAyzgxNvPn0-X45cfUrJw9HGdE0DvpIAg.jar
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-oJXnsDGgCj4zGZIfCzDDeExI2KPyDW7ffkfPE8_FGac.jar
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-PBJGjpPYb0hqrT5BsKgj4jH4BRr2fVxRXnk9nmyNdiI.jar
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-qnT6xtatt9AkhGTe8nWXS3PaD7Gqzee3PX98t0sspNA.jar
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tnt5X7umpoCibw_1dSgmMdLhcrCPNPBwqYHkUlXKaH0.jar
    Oct 14, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-QOGmK9zIMnyd3S-On3vvEZRbVGmfj-oER1n11Yvy-cA.jar
    Oct 14, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-oSSJcrPmJ3msTh00NxJsycNDW4XKBsNGU8M6NFpVbrA.jar
    Oct 14, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-yO9X9W6x3GCunfSsbFnlKVT7FwT9Xp2QOkkb8ot-aY8.jar
    Oct 14, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-Ae2-8pVeJQJs0X5BcrYUe0YNi6BRSCeObxa2yO0Etps.jar
    Oct 14, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 188 files cached, 30 files newly uploaded in 1 seconds
    Oct 14, 2020 6:45:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 14, 2020 6:45:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 14, 2020 6:45:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 14, 2020 6:45:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 14, 2020 6:45:46 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 14, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95593 bytes, hash 272826c34846389d1e61b448109c8564046a31f581ec5e84ac955951fe4a7a98> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Jygmw0hGOJ0eYbRIEJyFZARqMfWB7F6ErJVZUf5Kepg.pb
    Oct 14, 2020 6:45:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 14, 2020 6:45:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-14_11_45_47-11879328391137315313?project=apache-beam-testing
    Oct 14, 2020 6:45:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-14_11_45_47-11879328391137315313
    Oct 14, 2020 6:45:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-14_11_45_47-11879328391137315313
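
Besides the gcloud command above, a submitted job can also be cancelled from the launching program through the PipelineResult handle. A minimal sketch, assuming the pipeline was just submitted:

    import java.io.IOException;
    import org.apache.beam.sdk.PipelineResult;

    PipelineResult result = pipeline.run();   // submit to Dataflow
    try {
      result.cancel();                        // request cancellation of the job
    } catch (IOException e) {
      // the cancellation request could not be issued
    }
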
    Oct 14, 2020 6:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-14T18:45:47.095Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 14, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T18:45:54.223Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 14, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T18:45:54.932Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 14, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T18:45:55.065Z: Expanding GroupByKey operations into optimizable parts.
    Oct 14, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T18:45:55.118Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 14, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T18:45:55.256Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 14, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T18:45:55.276Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 14, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T18:45:55.296Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 14, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T18:45:55.329Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 14, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T18:45:55.683Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 14, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T18:45:55.762Z: Starting 5 workers in us-central1-c...
    Oct 14, 2020 6:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T18:46:18.686Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 14, 2020 6:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T18:46:23.819Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 14, 2020 6:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T18:46:43.267Z: Workers have started successfully.
    Oct 14, 2020 6:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T18:46:43.306Z: Workers have started successfully.
    Oct 14, 2020 6:47:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T18:47:14.686Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 14, 2020 6:47:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T18:47:14.850Z: Cleaning up.
    Oct 14, 2020 6:47:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T18:47:14.923Z: Stopping worker pool...
    Oct 14, 2020 6:48:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T18:48:06.114Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 14, 2020 6:48:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T18:48:06.248Z: Worker pool stopped.
    Oct 14, 2020 6:48:13 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-14_11_45_47-11879328391137315313 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 4734bec0-d0c0-44d6-bce1-216cfff4982a and timestamp: 2020-10-14T18:48:13.629000000Z:
                     Metric:                    Value:
                   read_time                    10.108
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 14, 2020 6:48:14 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 41.206 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 57s
107 actionable tasks: 73 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/zxjz22sxv25fm

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1114

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1114/display/redirect?page=changes>

Changes:

[pawel.pasterz] [BEAM-10628] Update amazon-kinesis-producer dependency

[pawel.pasterz] [BEAM-10800] Update azure-storage-blob dependency

[pawel.pasterz] [BEAM-8674] Update cassandra-driver-core dependency

[pawel.pasterz] [BEAM-10803] Update azure-storage dependency

[pawel.pasterz] [BEAM-10355] Update s3mock dependency

[pawel.pasterz] [BEAM-5905] Update qpid dependency

[pawel.pasterz] [BEAM-5544] Update cassandra-all dependency


------------------------------------------
[...truncated 283.00 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 14, 2020 12:45:34 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 14, 2020 12:45:34 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 14, 2020 12:45:34 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 14, 2020 12:45:34 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 14, 2020 12:45:34 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 14, 2020 12:45:34 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 14, 2020 12:45:34 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 14, 2020 12:45:34 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 14, 2020 12:45:34 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 14, 2020 12:45:34 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 14, 2020 12:45:34 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 14, 2020 12:45:34 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 14, 2020 12:45:34 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 14, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 14, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 218 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-5j8K-X-WbIlEHOAG26pVMf-m6Yd5PRxZFUI1ZBnNiZI.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-HSmeGZ5smfF_TOs537U-CWqVkJMES-hJk1JraHdjoJ8.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-zYh_H_hxEHdLhNeZhrIMCP5myUH2cEW66ZhIvNUNQ5s.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-8CjAc8SLoHpoNaUlouhrHhUzbLv-bA5sjQTW5vJT7Gk.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-CHFN7cYsCqQ99QqUXNEhke9neNt1nExXIAApBvfV1QI.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-EwOT3hqfIn8ZfXJek_xnWBrGQuhd-uobjLEreTb8-lM.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-vyyANwM-wh3JPMeUaPrLVQD0PZitrNMTbC8sXo7mLeo.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-Miuk14bk5UQqDGPtqGBtBJPlIntvr4LHkdHPBXjGJks.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-1BwOeeFBFnPtIW7rewJlxs0owtRkC1FIAc7MRKurk4s.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-xuO4jlQ3j3P0se-T9TnsjxDXzgFg5kijiEL7SRwCIUs.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-D5KgITD8x4XC7r2iXX8lSS13dN-o1cxY2WS9gkcVALE.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-LBXQ-FTiNws96MZx-xh_LA1ObRCZa9v5dt4QAgHClsE.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-aLE-xovwyWIR0K6CS55cq5sDDicxuhG6l4HcaU-yWbo.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-37kts2FB1O6T_zBzcE0FyUqP2NCA9iBdXB33a3x-2LE.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-MDu6npoWup_CtYsT42vihzjslpeVWkf6yWno74tKlPI.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-KDxeHoPGT-CqUU7KoAKAB5O-eaXJwMHStpvPpfUKIkw.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-gs14Y7XhlQ9SUsGqR4CAmuCyOKTm8omAIeToRrtUEMk.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-QdDsRx_uyTE6Hl9Dd9qCb35qvpteu7HUQA5P2WyW9aw.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-mOOZqC-peqxiNEQx15mm-8b1aTGY7LxJASeg_3sBTTE.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT--0i4q47KXy8awzgbvFtDwZpEG260VU04Ti678x45Iow.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3123629641872354970.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-IimAGhbyQ_i0FECHwRajkcwhYWo599UXhHvq3QZopHE.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-5cDwS_VaB8_iFZGx-WpDzcEFpY5anxo_dfizYjkAUo8.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-V085jzuEdCuaW1C4SrSLAO8rSgbdluL3gziY5DusaEo.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-Sr-kV09IMrZdH71ULbh25mB4ZcA-DVTdbyWf6FwUFBw.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-zST63fbVkZIQcnIlOeQKnADDywHew1I7sp046PIm7B0.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-pN7aKVOfRP3KvfSSpuanZZFPeEnezw3robt8l1Uc93c.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-ef3WlzpeScPSWV8kzCgidCqThWJl7rUa5xJF--qmaBA.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-5j8K-X-WbIlEHOAG26pVMf-m6Yd5PRxZFUI1ZBnNiZI.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-afnMnzLPREmmEqb2ugB-emkWvrIc95i_I_YMdowv3OU.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-YAlVemy626TDMxh3fsWKr6erDaQCWYwP2l1PQwFHq50.jar
    Oct 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-NAt5xCIhzI9JGTg0H-71qo4qivsOpneQyp4LC7TegsQ.jar
    Oct 14, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 188 files cached, 30 files newly uploaded in 1 seconds
    Oct 14, 2020 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 14, 2020 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 14, 2020 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 14, 2020 12:45:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 14, 2020 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 14, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95594 bytes, hash f75db634102c313680d1b38cc4764c44af62258bbd8f12594ec9db4d59beb5cf> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-9122NBAsMTaA0bOMxHZMRK9iJYu9jxJZTsnbTVm-tc8.pb
    Oct 14, 2020 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 14, 2020 12:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-14_05_45_39-2189307029295766956?project=apache-beam-testing
    Oct 14, 2020 12:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-14_05_45_39-2189307029295766956
    Oct 14, 2020 12:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-14_05_45_39-2189307029295766956
    Oct 14, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-14T12:45:39.900Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 14, 2020 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T12:45:46.838Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 14, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T12:45:47.430Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 14, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T12:45:47.467Z: Expanding GroupByKey operations into optimizable parts.
    Oct 14, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T12:45:47.498Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 14, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T12:45:47.563Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 14, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T12:45:47.601Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 14, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T12:45:47.680Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 14, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T12:45:47.717Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 14, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T12:45:48.040Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 14, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T12:45:48.131Z: Starting 5 workers in us-central1-c...
    Oct 14, 2020 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T12:46:06.381Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 14, 2020 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T12:46:12.487Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 14, 2020 12:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T12:46:35.872Z: Workers have started successfully.
    Oct 14, 2020 12:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T12:46:35.902Z: Workers have started successfully.
    Oct 14, 2020 12:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T12:47:05.555Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 14, 2020 12:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T12:47:05.700Z: Cleaning up.
    Oct 14, 2020 12:47:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T12:47:05.781Z: Stopping worker pool...
    Oct 14, 2020 12:47:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T12:47:58.924Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 14, 2020 12:47:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T12:47:58.960Z: Worker pool stopped.
    Oct 14, 2020 12:48:04 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-14_05_45_39-2189307029295766956 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0115e394-7505-44a4-ba29-dadec16dc1fc and timestamp: 2020-10-14T12:48:04.758000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    10.587

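For a rough sense of scale: if read_time is taken as wall-clock seconds for the fused read stage, this run works out to roughly 4,375,276 / 10.587 ≈ 413,000 fields read per second. The counter may be aggregated across the 5 workers, so this is only a back-of-the-envelope figure, not per-worker throughput.
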
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 14, 2020 12:48:05 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 4 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 2 mins 39.101 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 48s
107 actionable tasks: 74 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/itzmftz7xtjua

Stopped 3 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1113

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1113/display/redirect?page=changes>

Changes:

[piotr.szuberski] [BEAM-10885] Add Avro support to Kafka table provider

[piotr.szuberski] Remove booleans, use List of values

[piotr.szuberski] Fixes after CR

[piotr.szuberski] Change payloadFormat to format like in text table provider

[piotr.szuberski] Fix checkstyle

[piotr.szuberski] Use SimpleFunction instead of SerializableFunction

[piotr.szuberski] Remove getSchema() and listFrom(), make abstract generateRow, add

[piotr.szuberski] Remove generification of KafkaTestRecord

[piotr.szuberski] Add missing close paren

[piotr.szuberski] Get rid of getTable() and KafkaTestTable inheritance

[piotr.szuberski] Fix comment

[piotr.szuberski] Inline getRecordValueBytes

[piotr.szuberski] Remove generification of generateProducerRecord

[piotr.szuberski] Add full-stops to javadocs

[piotr.szuberski] Move statistics tests to separate file

[piotr.szuberski] Throw exception instead of returning null.

[piotr.szuberski] Update create-external-table.md

[piotr.szuberski] Update CHANGES.md

[piotr.szuberski] Use ByteArraySerializer as kafka value serializer

[noreply] Update CHANGES.md

[noreply] Update

[noreply] Update

[noreply] Update

[noreply] [BEAM-10987] Skip test_with_main_session on windows (#13099)

[noreply] [BEAM-9616] Allow new RegisterDoFn behaviour to be disabled (#13103)

[noreply] [BEAM-11020] Adding multi-window splitting to Go SDF. (#13070)


------------------------------------------
[...truncated 279.64 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 14, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 14, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 14, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 14, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 14, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 14, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 14, 2020 6:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

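The readUsingDefaultMethod failure here is the same coder-inference problem seen in the other runs: ParDo(RowMonitor) emits Beam Rows, and without a schema (or an explicit RowCoder) on its output PCollection the SDK cannot infer a coder. Below is a minimal, self-contained sketch of the fix the error message points at (PCollection.setRowSchema), using a hypothetical stand-in schema rather than the IT's actual HACKER_NEWS schema.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Hypothetical schema standing in for the projected HACKER_NEWS columns.
        Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();

        PCollection<Row> rows =
            p.apply(Create.of("story", "job"))
                .apply(
                    "ToRows",
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void processElement(@Element String type, OutputReceiver<Row> out) {
                            // Emit a Beam Row that matches the schema above.
                            out.output(
                                Row.withSchema(schema)
                                    .addValues("someone", type, "a title", 3L)
                                    .build());
                          }
                        }))
                // Without this (or .setCoder(RowCoder.of(schema))), coder inference fails
                // with the IllegalStateException reported above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }

Calling .setCoder(org.apache.beam.sdk.coders.RowCoder.of(schema)) is the equivalent explicit-coder route; either way the coder must be fixed before the output is consumed, which is why the failure surfaces during graph construction in Pipeline.applyInternal / finishSpecifyingInput rather than at run time.
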
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 14, 2020 6:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 14, 2020 6:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 14, 2020 6:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 14, 2020 6:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 14, 2020 6:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 14, 2020 6:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 14, 2020 6:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 14, 2020 6:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 14, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 218 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-a2bCuGZkoLwlp65HPx35ji3CmEHswGkPIsaowlT47vA.jar
    Oct 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-oNMzHlSriUzZy87ujC6w55MncktbmRuC_41Y5TRUZos.jar
    Oct 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-D-e4xrV4JXEXBp8FJoS5oG6P4eC3ksXw6snU48WzPTQ.jar
    Oct 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-1tHMrXhmpMuBi_qjfHfm4OgHowQ_aGL16ohMx4-nbDw.jar
    Oct 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-S6t-Hw8I9Qyboogj10mIUxBsU2UukU8zwn9NgcbUhBE.jar
    Oct 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-aXVA_EDnvhwayoHCiImy96-dJyYKfM5SUpStaia7x0Q.jar
    Oct 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-2dW1CxAh2QntIQT8KIDz7zLc4Qoxzw6guWhIUNsILLg.jar
    Oct 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-DeN9fRnYSHW8p1IpIFTsadbbosibnussBF_xKD8aO1A.jar
    Oct 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5133716208959396981.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-F07YAF3CUp5O-h_to83gLJsUr23BydcHBTMbXbuq1VE.jar
    Oct 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-TsIg9Zv63ZdE3JACv93C9GnXC1WceknPGfzaDcheu4o.jar
    Oct 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-I6R2p26XTmfDf8AcCnpz7L7DJ9VjJzofVLafjmMrLdk.jar
    Oct 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-1IWUxDw6FiFb8gg9oyOl_iEc7fmi-jKbQgXDc5k2BMI.jar
    Oct 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-3zzphr669nUatDgz3TN2if4V5diYLPgdM-n6FdXoANU.jar
    Oct 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-a2bCuGZkoLwlp65HPx35ji3CmEHswGkPIsaowlT47vA.jar
    Oct 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-R5dgkmaidAnhkGzSGE_9IDlZNZWR51b4b_bNQQ5rkqU.jar
    Oct 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-T0aC3R0wFF65W7FJA8Omy1joif7FScrRosEym3wikls.jar
    Oct 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-WWGn5lxMd0DAOa5WSAJUFt5wHG4B-Ze9UIBeNFqSXf0.jar
    Oct 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-qHalPBB4j2_wYQwcn4vFo-yD4HiK2U3_kGVKCfk37JA.jar
    Oct 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-gfmgp47n_Pawx_EPH3tnt4WCsuhz2Hj3jSF3OTk_B7M.jar
    Oct 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-C8oIIPEMKs-bwapjP0bW7T8nureYCXu0eIX3qeqtQEQ.jar
    Oct 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-yBoSQkGjMNnoW0_jP4ZNdmk2YtbIcX36c1J9DO7oCVI.jar
    Oct 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-JdFOsVMi43uUORyOZWAyiSH7ih15JsqlEhfhqDFsZ4c.jar
    Oct 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-6-M7cxnbRa0ae89ROvTIljN0ZAcikNCkMML2o41wbis.jar
    Oct 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-C0ionD0sFljtEfU2s1Q_lVHSBN_5EDmEmatMjpXEt28.jar
    Oct 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-8w9aVPhfCRCyIZKYrQymZmRBVZ4pgpq02aEY4GNKEv0.jar
    Oct 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-w_VEZNhC-gFNvk6rJZYUOx462tslP6NU1PaoFX0AJEU.jar
    Oct 14, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-N-72yz06CotJvm6k8D9s90AKJcQ_i01yuq-rxuvsrVU.jar
    Oct 14, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-fe3jTqbfyVmpJKLvZn-CtTC83aKZdkMFfUDekXI7D-c.jar
    Oct 14, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-9RhvQXkfB-2dGiF8SIhPadm2KiDzs2r3ksutItyUtos.jar
    Oct 14, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-cnB5sudXXJKmCUtCg4sYhBe7RAWckatDutJzhEv7j40.jar
    Oct 14, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-Rkc97JRbym5Il2_9uKpBQmf9edYtKTV7V7OFY35PXXk.jar
    Oct 14, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 188 files cached, 30 files newly uploaded in 1 seconds
    Oct 14, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 14, 2020 6:45:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 14, 2020 6:45:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 14, 2020 6:45:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 14, 2020 6:45:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 14, 2020 6:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95594 bytes, hash a15fe8e6320a09cd4fb0d82cb0354aebe62c3f2a8d93e0464055b3724b42816e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-oV_o5jIKCc1PsNgssDVK6-YsPyqNk-BGQFWzcktCgW4.pb
    Oct 14, 2020 6:45:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 14, 2020 6:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-13_23_45_41-11795717899390997910?project=apache-beam-testing
    Oct 14, 2020 6:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-13_23_45_41-11795717899390997910
    Oct 14, 2020 6:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-13_23_45_41-11795717899390997910
    Oct 14, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-14T06:45:41.561Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 14, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T06:45:52.750Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 14, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T06:45:53.462Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 14, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T06:45:53.508Z: Expanding GroupByKey operations into optimizable parts.
    Oct 14, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T06:45:53.544Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 14, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T06:45:53.604Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 14, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T06:45:53.626Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 14, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T06:45:53.658Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 14, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T06:45:53.694Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 14, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T06:45:54.170Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 14, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T06:45:54.246Z: Starting 5 workers in us-central1-c...
    Oct 14, 2020 6:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T06:46:17.762Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 14, 2020 6:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T06:46:18.408Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Oct 14, 2020 6:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T06:46:18.565Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Oct 14, 2020 6:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T06:46:24.001Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 14, 2020 6:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T06:46:44.360Z: Workers have started successfully.
    Oct 14, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T06:46:44.471Z: Workers have started successfully.
    Oct 14, 2020 6:47:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T06:47:15.135Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 14, 2020 6:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T06:47:16.587Z: Cleaning up.
    Oct 14, 2020 6:47:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T06:47:17.182Z: Stopping worker pool...
    Oct 14, 2020 6:48:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T06:48:10.103Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 14, 2020 6:48:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T06:48:10.136Z: Worker pool stopped.
    Oct 14, 2020 6:48:21 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-13_23_45_41-11795717899390997910 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 615366f9-b47e-4cc6-b11e-95036e4181a9 and timestamp: 2020-10-14T06:48:21.461000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.064

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 14, 2020 6:48:21 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 53.966 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 5s
107 actionable tasks: 73 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/sm5gbqll2ywcw

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1112

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1112/display/redirect?page=changes>

Changes:

[rohde.samuel] Add ability for HotKeyLogger to log a key.

[noreply] [BEAM-10861]Adds PubSub Runner API encoding to Read/Write transforms

[noreply] [BEAM-8251] Add worker_region and worker_zone options (#13072)

[noreply] [BEAM-10720] Finish implementing StringMethods (cat, repeat) (#12705)

[noreply] [BEAM-10009] Document micros_instant (#12921)


------------------------------------------
[...truncated 279.12 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 14, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 14, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 14, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 14, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 14, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 14, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 14, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 14, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 14, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 14, 2020 12:45:24 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 14, 2020 12:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 14, 2020 12:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 14, 2020 12:45:24 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 14, 2020 12:45:24 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 14, 2020 12:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 14, 2020 12:45:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 14, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 218 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 14, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT--wc6uWdh6D2eA1OmSvTZCtnMaHu9CbQGy7m0353I8pw.jar
    Oct 14, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-K7JMDKTGNGCQKJeTkLIYeVJMieHxLKbC00oqfHcO1OA.jar
    Oct 14, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-eMSsNQlW1anITzoVTGpOG9sCWjC5wjQHNDcB2GrO6S4.jar
    Oct 14, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-z-uk4lbyi2tdxxwqkdEu_uStaYJqSU4cUifhL52fqi4.jar
    Oct 14, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-qroIl54TE3zBRynj_-spRNBOxMOp11IqDn7lLVdwAzQ.jar
    Oct 14, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-k8srW7fG-vjF5XatXudoZj-SxqH_jGf-W9p63BUcEFI.jar
    Oct 14, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-B_Z4SuxwwsEKhfNIInKQuSdRZgAmA_pA2pstWa52vkw.jar
    Oct 14, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-eMSsNQlW1anITzoVTGpOG9sCWjC5wjQHNDcB2GrO6S4.jar
    Oct 14, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-nBqs5xpTgYbj03bTVJ5sx3Qhg7NXFXF5BFtMMymuKQ8.jar
    Oct 14, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-bkx0TW_ldfU96AhkKbpPGIwqrsarBaJmKWigmMGKwLY.jar
    Oct 14, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-1eGN2OQaD5OMW_truzaJKWilOwEzleonwdEyhfz9_t0.jar
    Oct 14, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4280355642073271894.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-UyqotivIoXAm6AKpLhp4tL3aerQYtdPWfT-jHLCnKqU.jar
    Oct 14, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-oxHVZSL4GLkvttj7o0CLlc9x1NrAujqwNY31tesRgNQ.jar
    Oct 14, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-lSbl3Ww4u6OIpVvmrBRt_jBNHzD0YpZWHmagLLjxkYk.jar
    Oct 14, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-fpOD-s03wNNAKPLWBbsb-JoR322zcjn1MCoh-nQhVoc.jar
    Oct 14, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-kMh4EcYfAA6JJlGgfyVl409Hl6qnkePOGcKAdaUrgD4.jar
    Oct 14, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-Z3TnizTUhYf5dWzUOG-iYOEJVMrp3NgYB9qKPtxDnEM.jar
    Oct 14, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-ylV0vnory8hGvtSwk-9iMvohksm9-0KJ-5lAKMrEsQU.jar
    Oct 14, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-7CSbfeAZYxCJoaEvmA6GZxfYQqh0ab-SIqxa6CRMp6s.jar
    Oct 14, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-GJI09Qz6s5FbdgRIM7X1Hcftig8ehd8L2kBjDYrQoeU.jar
    Oct 14, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-aoBdY8mjCeBs09MayNKtZcvikX2RkFXr94DTHIbdvh0.jar
    Oct 14, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-vAyY7khe5dgqaCsubKMKEul1yMFhL826ujPlXnLB6yc.jar
    Oct 14, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-MtaQuUJf1-s_5HM5L7daY_86Uomyg2fLZbarfzo7eD8.jar
    Oct 14, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-SCfAbeK8qxpyTehmmeeobKIMdjNQcWD3TmDEfJNwpw4.jar
    Oct 14, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-vhsn0RqhqgqrhJxwoP0IQfIFcw5ydsi4Es-kUg2h2Kg.jar
    Oct 14, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-WrMz2d2THih8zy-e92Bm59Zi9TOifUig4-0wLaJtXTw.jar
    Oct 14, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-yFxRfCPccXrpTHo4JaN9GdFY-UscQEgQQowBrApxYKc.jar
    Oct 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-d8kQS775ChjB77iJqc_PODW_tI8f1El8Ksm_DA_ygus.jar
    Oct 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-BHhYEuixmEbpml8bQkCZVbdhutuBhEmMCOduUaNj1qI.jar
    Oct 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-prmeS-R2Sv7nyDNpIJ8tZWV4KjAgL2QqiGKH43Il9mA.jar
    Oct 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-RKEJ6oYbEAMS5j9Ui2qHwQNzPUywqjJ8TmKC38HU56o.jar
    Oct 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 188 files cached, 30 files newly uploaded in 0 seconds
    Oct 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95594 bytes, hash 594eb07b2c00d430c7f028ac8e47b85df980e81b5645c1ecc2334a0e2e9f6bd9> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-WU6weywA1DDH8Cisjke4XfmA6BtWRcHswjNKDi6fa9k.pb
    Oct 14, 2020 12:45:30 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 14, 2020 12:45:31 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-13_17_45_30-14624904059014595259?project=apache-beam-testing
    Oct 14, 2020 12:45:31 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-13_17_45_30-14624904059014595259
    Oct 14, 2020 12:45:31 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-13_17_45_30-14624904059014595259
    Oct 14, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-14T00:45:30.252Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
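
The warning above reflects option interplay rather than an error: once autoscaling is disabled, the max-worker setting is ignored and the pool is pinned at the requested worker count. A minimal sketch of the relevant Dataflow options, assuming a job that fixes the pool at 5 workers (the concrete options used by this test are not shown in the log):

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class FixedWorkerOptionsSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        // Disabling autoscaling pins the pool at numWorkers; maxNumWorkers is then
        // ignored, which is exactly what the WARNING above reports.
        options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE);
        options.setNumWorkers(5);
        options.setMaxNumWorkers(5); // ignored when autoscalingAlgorithm=NONE
      }
    }
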
    Oct 14, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T00:45:38.628Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 14, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T00:45:39.294Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 14, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T00:45:39.336Z: Expanding GroupByKey operations into optimizable parts.
    Oct 14, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T00:45:39.365Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 14, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T00:45:39.455Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 14, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T00:45:39.495Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 14, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T00:45:39.531Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 14, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T00:45:39.556Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 14, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T00:45:40.013Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 14, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T00:45:40.098Z: Starting 5 workers in us-central1-c...
    Oct 14, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T00:45:44.378Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 14, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T00:46:02.893Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 14, 2020 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T00:46:21.836Z: Workers have started successfully.
    Oct 14, 2020 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T00:46:21.869Z: Workers have started successfully.
    Oct 14, 2020 12:46:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T00:46:54.220Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 14, 2020 12:46:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T00:46:54.371Z: Cleaning up.
    Oct 14, 2020 12:46:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T00:46:54.503Z: Stopping worker pool...
    Oct 14, 2020 12:47:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T00:47:45.762Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 14, 2020 12:47:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-14T00:47:45.813Z: Worker pool stopped.
    Oct 14, 2020 12:47:51 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-13_17_45_30-14624904059014595259 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f4ea50f2-cf9b-4536-b03c-285dd3f77f21 and timestamp: 2020-10-14T00:47:51.358000000Z:
                     Metric:                    Value:
                   read_time                    14.218
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 14, 2020 12:47:51 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 17 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.003 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.008 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 35.842 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 35s
107 actionable tasks: 73 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/jygldd3hsrs4i

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1111

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1111/display/redirect?page=changes>

Changes:

[Ismaël Mejía] [BEAM-10969] Upgrade Spark to version 2.4.7

[Luke Cwik] [BEAM-9309] Remove the READ urn/payload from the Java SDK harness.

[Kenneth Knowles] Enable some supported ZetaSQL functions

[noreply] [BEAM-11056] Fix warning message and rename old APIs (#13080)


------------------------------------------
[...truncated 285.10 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 13, 2020 6:46:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 13, 2020 6:46:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 13, 2020 6:46:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 13, 2020 6:46:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 13, 2020 6:46:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 13, 2020 6:46:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 13, 2020 6:46:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
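
The IllegalStateException above is the coder-inference failure behind both failed tests: the ParDo(RowMonitor) step emits Beam Row values, and a Row PCollection needs an explicit schema before the pipeline is finalized. A minimal, hypothetical sketch of the fix the message suggests (field names and types are assumed from the SELECT list, and rowMonitor stands in for the test's RowMonitor DoFn; this is not the actual test code):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaFixSketch {
      static PCollection<Row> monitorWithSchema(PCollection<Row> input, DoFn<Row, Row> rowMonitor) {
        // Schema for the projected columns from the query above (types assumed).
        Schema rowSchema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();
        return input
            .apply("ParDo(RowMonitor)", ParDo.of(rowMonitor))
            // Satisfies coder inference, per the hint:
            // "Please provide a schema instead using PCollection.setRowSchema".
            .setRowSchema(rowSchema);
      }
    }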

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 13, 2020 6:46:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 13, 2020 6:46:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 13, 2020 6:46:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 13, 2020 6:46:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 13, 2020 6:46:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 13, 2020 6:46:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 13, 2020 6:46:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 13, 2020 6:46:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
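
The BEAMPlan and the pushed-down filter above mean both the column projection (usedFields) and the WHERE clause are handed to the BigQuery Storage read. For orientation, a rough stand-alone equivalent written directly against BigQueryIO — the table reference is a placeholder and the wiring is an assumption, not taken from the test:

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    class PushDownReadSketch {
      static PCollection<TableRow> read(Pipeline p) {
        return p.apply(
            "Read Input BQ Rows with push-down",
            BigQueryIO.readTableRows()
                .from("some-project:some_dataset.HACKER_NEWS") // placeholder table reference
                .withMethod(Method.DIRECT_READ)
                // Projection matching usedFields=[by, type, title, score]
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Row restriction matching the pushed-down predicate logged above
                .withRowRestriction("(`type` = 'story' OR `type` = 'job') AND `score` > 2"));
      }
    }
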
    Oct 13, 2020 6:46:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 13, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 218 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 13, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-G73FpjNOKjRb5V4Z4XY4GrvfRHxwXNaQdr2AphcHknc.jar
    Oct 13, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-KF3D066hZbFPXs4Smf-N8ldPbRlqVCjVDWS_GzDQzhg.jar
    Oct 13, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-FI8WR3DoTKKkEv7xsi7QJvz0KsCyoJe7TgePxYUFSL0.jar
    Oct 13, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tuZGnRhtWw3EcU8G7GLG22XEIqr1lgegE6h9Fgch0zQ.jar
    Oct 13, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-UwKdaRnyH9oicXf3tOtQw7Deu4BqydZ7gheAQtxXa7o.jar
    Oct 13, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-1X_vE3o74gd1vrHDwGE5n-2AegP2WOuUqwwXYuucdms.jar
    Oct 13, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-hj-_YRWfxqnN1CKN6TRMwLn1nsuSSlpxE0FMK9nZVD4.jar
    Oct 13, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-RER-rYEDXRhBauIyNTYGN96pM0KvxdRbA8Jd4qYigJ4.jar
    Oct 13, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-TQS2va7VlWrrtjlAJBTaMZ3qPLucrLZBzcgJXgiCyck.jar
    Oct 13, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-6CbzDKA8W6ccl-q_cI6_NDBBM9bzgUDIHTpwY0yuW_Y.jar
    Oct 13, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-xdHz0XFic50sT6xjbtFZmKew71MtZktV0urIDZ7yZmE.jar
    Oct 13, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT--UfHw7Jq0DJyKR97X1C17HN4qICvFWDIunpe8n_rLqI.jar
    Oct 13, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-z30nZFFek5Ip_MRkOWJ_70NTujsWdReE-hsoAF865fs.jar
    Oct 13, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-GUiVCXhtL1k9GAR6jt3AaNJ78Zt50NC0tz8ZuElWoa4.jar
    Oct 13, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-CSYXQ2TpcbMfDD5WhdDfulPeooaP3Hnr3ooggXkhtUo.jar
    Oct 13, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-Jm5fAaoIHPTnfwqeOf7S3JT3krOX7egRsrsdwgFMzrw.jar
    Oct 13, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-NvrwIQdwANlhquJdODx6dA8Xlx4lFvhccOyc9UBypdg.jar
    Oct 13, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-psooM0YX_H95HDfN9mi4kdvjblIyXzmKJUBx50R87oA.jar
    Oct 13, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-ieYiMGyW-KfeERXVmFMDYWtko5iJMN3QQv38ClevTNI.jar
    Oct 13, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-HPAZ3SbJcwGBreAdXl2Rpk06F9nwP-pNdYjYvDZslEs.jar
    Oct 13, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4696456152078727281.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-LRjYKL-ufkSgz71tDQSY4HCRBPf_iDQ2SR80VhmECdk.jar
    Oct 13, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-LCvGfWamFU17CloFm3L8JbPYDtglxe-jPIAUzEVxWY8.jar
    Oct 13, 2020 6:46:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-8xsK2BNrTgWeLsQievB1RobqaaphgxyYeHjYBRLiooE.jar
    Oct 13, 2020 6:46:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-1wULemJaF-UslG_hUuhH-wC8n0x4A8WeuPo8JwuzRWg.jar
    Oct 13, 2020 6:46:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-SSgAmuWMcEzLxmEYv0rPmwyBu-o4sRO86rT2Crk85V4.jar
    Oct 13, 2020 6:46:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-itCnIDBksKbnCnGIZglCgVZ5c5f1yv5e_JwXyliL0fY.jar
    Oct 13, 2020 6:46:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-G73FpjNOKjRb5V4Z4XY4GrvfRHxwXNaQdr2AphcHknc.jar
    Oct 13, 2020 6:46:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-RoJTpuKci-BY0VbR70Y2YUizgUz7DHXTCsIrwZqVENw.jar
    Oct 13, 2020 6:46:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-nLLFd1tioGUPDeXFXNdt2tdOWYR52PIr7hLr84-ZM4U.jar
    Oct 13, 2020 6:46:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-FOvx_POyBDR03xDJr85AWy6QVaidN2ke7UF9foUgl78.jar
    Oct 13, 2020 6:46:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-RR2MPfDqXZLiY4gO7jkK30tIy6WcxELK81tnZa2skN4.jar
    Oct 13, 2020 6:46:29 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 188 files cached, 30 files newly uploaded in 1 seconds
    Oct 13, 2020 6:46:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 13, 2020 6:46:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 13, 2020 6:46:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 13, 2020 6:46:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 13, 2020 6:46:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 13, 2020 6:46:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95588 bytes, hash b3739d8a71bd6a734a26e8c79ae9a3c9c65a8b5d453da02ad0c8aec7e9641f1c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-s3OdinG9anNKJujHmumjycZai11FPaAq0Miux-lkHxw.pb
    Oct 13, 2020 6:46:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 13, 2020 6:46:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-13_11_46_29-5157491514330082848?project=apache-beam-testing
    Oct 13, 2020 6:46:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-13_11_46_29-5157491514330082848
    Oct 13, 2020 6:46:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-13_11_46_29-5157491514330082848
    Oct 13, 2020 6:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-13T18:46:29.952Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 13, 2020 6:46:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T18:46:40.360Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 13, 2020 6:46:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T18:46:41.109Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 13, 2020 6:46:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T18:46:41.167Z: Expanding GroupByKey operations into optimizable parts.
    Oct 13, 2020 6:46:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T18:46:41.200Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 13, 2020 6:46:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T18:46:41.286Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 13, 2020 6:46:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T18:46:41.314Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 13, 2020 6:46:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T18:46:41.393Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 13, 2020 6:46:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T18:46:41.453Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 13, 2020 6:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T18:46:41.864Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 13, 2020 6:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T18:46:41.949Z: Starting 5 workers in us-central1-c...
    Oct 13, 2020 6:46:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T18:46:49.560Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 13, 2020 6:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T18:47:09.983Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 13, 2020 6:47:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T18:47:38.133Z: Workers have started successfully.
    Oct 13, 2020 6:47:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T18:47:38.172Z: Workers have started successfully.
    Oct 13, 2020 6:48:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T18:48:10.422Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 13, 2020 6:48:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T18:48:10.573Z: Cleaning up.
    Oct 13, 2020 6:48:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T18:48:10.666Z: Stopping worker pool...
    Oct 13, 2020 6:49:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T18:49:01.602Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 13, 2020 6:49:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T18:49:01.651Z: Worker pool stopped.
    Oct 13, 2020 6:49:08 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-13_11_46_29-5157491514330082848 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 1172fd9e-813c-48f4-8e17-52d880e41319 and timestamp: 2020-10-13T18:49:08.559000000Z:
                     Metric:                    Value:
                   read_time                    12.193
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 13, 2020 6:49:09 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 5 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 53.93 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 52s
107 actionable tasks: 79 executed, 28 from cache

Publishing build scan...
https://gradle.com/s/5a3v56m2c26dy

Stopped 4 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1110

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1110/display/redirect?page=changes>

Changes:

[noreply] Add Relative Path Sensitivity (#12955)


------------------------------------------
[...truncated 330.23 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 13, 2020 12:47:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 13, 2020 12:47:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 13, 2020 12:47:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 13, 2020 12:47:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 13, 2020 12:47:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 13, 2020 12:47:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 13, 2020 12:47:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 13, 2020 12:47:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 13, 2020 12:47:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 13, 2020 12:47:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 13, 2020 12:47:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 13, 2020 12:47:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 13, 2020 12:47:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 13, 2020 12:47:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 13, 2020 12:47:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 13, 2020 12:47:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 218 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-Mrch3bbKcE-y8wFjiE9uiiDaYXxI4lCdVJp9uihQBms.jar
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-r_EztzH7SXjM2k4SLwmIkyh2hth9kFYvStJELRmGeVY.jar
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-A9W8ncqKfSHE7HzmT76cw9nnjJzGN1C7kYpz2d28QTg.jar
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-xnsDy2dVOKDW7lRtGZ8vafSaRI5P9cPaMcH9MWeNywo.jar
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-VDWcGUT6OPMrEQWtsABRuklFqlNPa10ultq8qg23l68.jar
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-Ej1qytiB-suRTSOyF_dmefws8IfldCzaxygpCrCVU6o.jar
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-U8YYnOiQWL3WfY7HNprgJ2LM4GdZDnSiuoD1ZfCfiqc.jar
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded--YgldIVqiouIyRrteEv83NcE6mSVj1-KF46-Dpuqax4.jar
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-e1OLvNLg6G2D89IrgpPnzOo46fpZJVDKQvoUBSRPg0w.jar
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-yiDV10jFHF2SJ5ggbV4s3KAeeVM6kR2Dng38fNtBADs.jar
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-6RJikLSla3bSgt-s2DvvqMzXNJKGAPndIdlBQjoR-ag.jar
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-F6rv5qZdPHXri0Y6hh82pjmJnr9AyFY4i_TnEuJ8VR8.jar
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-joRARPLLzZhK4Sj-i9SHSG8x07erAKNfyz7U88KZvGk.jar
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-P4ePb09WrARKe9dsbOzSu6yVR49HvAnEstbgKDKYYqE.jar
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-fqraNc_10Jr2MxCVGyUV1XWuM8DG9D2WXrVc59_ClLg.jar
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-fqraNc_10Jr2MxCVGyUV1XWuM8DG9D2WXrVc59_ClLg.jar
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-zeOeKvNr0dPm41J-e33-qZRLdoEIpYEs0vIsyf8YKa4.jar
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-EcZV4JpjhAWKA7jSOdC4F8Pxwe4sHPI0LYYGemW2KCU.jar
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-5A_1O45cLu0YR0r0uQPVtiTkIw1COGm-lNP9cg5AI4A.jar
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test9054824255728691138.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-0mr6W0BIeFCU6L8AtJqBz3EpT8wSQ3uBUS9HijL1k30.jar
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-6x53d-tnFAEu24fqc7E0vOo7wVRIEgKh1437S1xiJzE.jar
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-nM6YGgy75F37LKkzEWqQ2h7pXfd9Vsg8mhYmuUTIljc.jar
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-xm6lNFiR7Kc8QObFigXAVGb_hF-mH7WXT6GreID0fYA.jar
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-ElIzR4PVvJfe6rU904tTmNHS7DhZqAcqaSucwEMet3I.jar
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-w4Mc5uBSzdTP-znARlgZGPLvbHSXme4AgIblCtiNmlI.jar
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-NQdP3ZIqVXtfNiOdVzuT90wglY-YzUT-O7dN2hFFS34.jar
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-yVn8UUgmQupXHt7vRhYOaNuRu5UAr6XD3QJu2r2sCy0.jar
    Oct 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-up0JxmoByU8uUWIMJHHZQTZ7lEnyEQ09pEY-2IBio_U.jar
    Oct 13, 2020 12:47:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-FVSHGK9QkYCl196jdyOJ5DoQscgm3jEJO2xjsBBOlXw.jar
    Oct 13, 2020 12:47:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-_9hzv41dNUxu4jMGDgDyrdOMmMmjCH0cg2VjPlBPztg.jar
    Oct 13, 2020 12:47:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-ZBGRBxVvvE9Klz_Pg5-vIk-TXRep_TZYw2OcO4rTUUM.jar
    Oct 13, 2020 12:47:44 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 188 files cached, 30 files newly uploaded in 1 seconds
    Oct 13, 2020 12:47:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 13, 2020 12:47:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 13, 2020 12:47:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 13, 2020 12:47:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 13, 2020 12:47:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 13, 2020 12:47:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95588 bytes, hash 9c651cd2b4c82c49861d5125816a666744ede53e7a1a8d32e3d2d7c90398f08b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-nGUc0rTILEmGHVElgWpmZ0Tt5T56Go0y49LXyQOY8Is.pb
    Oct 13, 2020 12:47:45 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 13, 2020 12:47:46 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-13_05_47_45-8746422049139772137?project=apache-beam-testing
    Oct 13, 2020 12:47:46 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-13_05_47_45-8746422049139772137
    Oct 13, 2020 12:47:46 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-13_05_47_45-8746422049139772137
    Oct 13, 2020 12:47:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-13T12:47:45.120Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 13, 2020 12:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T12:47:53.882Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 13, 2020 12:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T12:47:54.621Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 13, 2020 12:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T12:47:54.666Z: Expanding GroupByKey operations into optimizable parts.
    Oct 13, 2020 12:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T12:47:54.698Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 13, 2020 12:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T12:47:54.768Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 13, 2020 12:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T12:47:54.798Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 13, 2020 12:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T12:47:54.842Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 13, 2020 12:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T12:47:54.875Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 13, 2020 12:47:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T12:47:55.336Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 13, 2020 12:47:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T12:47:55.432Z: Starting 5 workers in us-central1-c...
    Oct 13, 2020 12:48:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T12:48:06.285Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 13, 2020 12:48:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T12:48:19.748Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 13, 2020 12:48:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T12:48:41.921Z: Workers have started successfully.
    Oct 13, 2020 12:48:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T12:48:41.958Z: Workers have started successfully.
    Oct 13, 2020 12:49:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T12:49:20.097Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 13, 2020 12:49:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T12:49:20.376Z: Cleaning up.
    Oct 13, 2020 12:49:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T12:49:20.534Z: Stopping worker pool...
    Oct 13, 2020 12:50:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T12:50:03.407Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 13, 2020 12:50:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T12:50:03.494Z: Worker pool stopped.
    Oct 13, 2020 12:50:09 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-13_05_47_45-8746422049139772137 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c1bebd76-aeb6-402a-af30-8a2513928db0 and timestamp: 2020-10-13T12:50:09.817000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.062

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 13, 2020 12:50:10 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 12 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.04 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.059 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 40.339 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
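
For example (a hedged illustration only: the command below is assembled from the task name and flags shown in this log, not taken from the Jenkins job configuration; run it from the Beam source root):

> ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --stacktrace --info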

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 54s
107 actionable tasks: 106 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/cd3cpxdgcj7ie

Stopped 11 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1109

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1109/display/redirect>

Changes:


------------------------------------------
[...truncated 274.72 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 13, 2020 6:45:49 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 13, 2020 6:45:49 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 13, 2020 6:45:49 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 13, 2020 6:45:49 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 13, 2020 6:45:49 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 13, 2020 6:45:49 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 13, 2020 6:45:49 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
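
    The first remedy the exception suggests (PCollection.setRowSchema) can be sketched as follows. This is an illustrative snippet, not the code of BigQueryIOPushDownIT; the class and field names are assumptions that simply mirror the projected columns in the SQL above:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaFixSketch {
      // Field names mirror the projected columns (author, type, title, score) in the query above.
      static final Schema OUTPUT_SCHEMA =
          Schema.builder()
              .addNullableField("author", Schema.FieldType.STRING)
              .addNullableField("type", Schema.FieldType.STRING)
              .addNullableField("title", Schema.FieldType.STRING)
              .addNullableField("score", Schema.FieldType.INT64)
              .build();

      // Attaching a schema to the Row PCollection lets Beam infer a RowCoder,
      // so the "Unable to return a default Coder" check above no longer fires.
      static PCollection<Row> withSchema(PCollection<Row> rows) {
        return rows.setRowSchema(OUTPUT_SCHEMA);
      }
    }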

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 13, 2020 6:45:49 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 13, 2020 6:45:49 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 13, 2020 6:45:49 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 13, 2020 6:45:49 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 13, 2020 6:45:49 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 13, 2020 6:45:49 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 13, 2020 6:45:50 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 13, 2020 6:45:50 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 13, 2020 6:45:51 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 218 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-nFNlg9EVCJjbkAQDx7yhucGpFgbMzBvCQJWzBAIviJQ.jar
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-ZZYeKWeVDTUd5izXZbAdomyKFYD-83X5PlMwgLIo7gE.jar
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-nFNlg9EVCJjbkAQDx7yhucGpFgbMzBvCQJWzBAIviJQ.jar
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-PXyBhCFdeZL656cEpSweAX-OhdPGiNw9Mod9c1yERV8.jar
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-HWTC_4j4sxZlGSmTY64CeXedKp6mmndp427nE3AsKhg.jar
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8625710991218304514.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-SIiCrPHPQpwDC9LfDxkmiXx65DvwIkH4ZFPbwnHzpMM.jar
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-U1tT2eHN8_zUa9G9uvQ0Wy9ULETFB2U3wPymdEpTtmU.jar
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-xwkhczArXw4m8pgqMEyjFN6bn7wOabeMvAwtbqTd6rw.jar
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-PsiDOa3Fxqf7Q8gyjWnQW-6ZlT4cWTiQexINmvOx4k4.jar
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-FQWnJCrl4ElufKK1rAACo-Cs99RgsBuK3YPLG1osJeI.jar
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-0-aaEKOmtsy1lbPOcbUB8JOzPz-Qp9bOH0_igHfC4TQ.jar
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-LiX_rNz-cI9tnYEGYbTYrBqbLE41yNXSPGTfKC6Br_s.jar
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-1_7VUOX05tSdYlQzWFnSes1X9ALB7Mkqb4WQiZBziC0.jar
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-7IjLkTAxtH7AdtzssaPQCie9Cv883UL4JpZyx0as5R8.jar
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-1nQsstlXqYpX5jW6VtMlrMZwykc0YuxPtnU6yHOYNGo.jar
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-y1sfVpLPjHtc6fgQQdtKU4UDQGnWdZOSuzblYm6D9uY.jar
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-3AJWDkhIQJ8D6xjDbOZwZO3KIbhWXMh3vGAP_y6sdMI.jar
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-W7EGUC5gYPSwvgToTy97UGWGThJFJmieDa_nb25EzG0.jar
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-NshHEIw2br2Z7IF4mVFZEYdfjnYXMNsj7e7fgcNP-hY.jar
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-nJ-BMB0aEfSQQew1cvlKDJ3dmmSmeJlB_69lBinXJQg.jar
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-A6e07f87gzsrJy0QcO792j_LvYFBSOtJu-aWkeBsxdw.jar
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-6iP02CCOpXxq2xOaDB1_6A6kuhVeOjN1mXMpwx-8nyI.jar
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-TSrNeP3DppUypViO0ZSVgFSJRFFh3L6gpsEyESL5O-A.jar
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-BtI8Z7A55SC6QX3ueTrA4bcxXiZ7iC1h_q1vNDnC410.jar
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-Gc30_0lHNRjygMt9a4piYJikvFHcvU-CT5UCPvnTm2E.jar
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-Q-Kp8yD6cLwNzmZfE4RFYYKDFKjSe--rsy4IcIjLweQ.jar
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded--JTcF5onFj9nX-kXh4Ambiu0U-UdSk9f__4PZhe1cRA.jar
    Oct 13, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-PXCii6KkBHIUv6ulsOk1NBRyVvdRYRnYphDWyrq3biA.jar
    Oct 13, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-GuXdjvgAt5GDX22no2yW50a4oqFVAdMzHDD8xKHzLyI.jar
    Oct 13, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-qyiMSnJGoxSrz1kWrUwHiOD3fbENNcHPZDne-owmW8A.jar
    Oct 13, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-ilHGUOq2JC_jxJCoBD26zUVhz7jDX-uBhpr1Vyp1uy0.jar
    Oct 13, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 188 files cached, 30 files newly uploaded in 1 seconds
    Oct 13, 2020 6:45:55 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 13, 2020 6:45:55 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 13, 2020 6:45:55 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 13, 2020 6:45:55 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 13, 2020 6:45:55 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 13, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95587 bytes, hash 06fb7cf33ae110c1ea5d99dba089ab8d816b293fa4c7be67c48fd4e9f435cbb2> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Bvt88zrhEMHqXZnboImrjYFrKT-kx75nxI_U6fQ1y7I.pb
    Oct 13, 2020 6:45:55 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 13, 2020 6:45:57 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-12_23_45_56-7064660707752045496?project=apache-beam-testing
    Oct 13, 2020 6:45:57 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-12_23_45_56-7064660707752045496
    Oct 13, 2020 6:45:57 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-12_23_45_56-7064660707752045496
    Oct 13, 2020 6:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-13T06:45:56.138Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 13, 2020 6:46:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T06:46:04.180Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 13, 2020 6:46:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T06:46:04.829Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 13, 2020 6:46:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T06:46:04.870Z: Expanding GroupByKey operations into optimizable parts.
    Oct 13, 2020 6:46:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T06:46:04.896Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 13, 2020 6:46:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T06:46:04.967Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 13, 2020 6:46:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T06:46:04.986Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 13, 2020 6:46:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T06:46:05.013Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 13, 2020 6:46:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T06:46:05.038Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 13, 2020 6:46:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T06:46:05.700Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 13, 2020 6:46:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T06:46:05.767Z: Starting 5 workers in us-central1-c...
    Oct 13, 2020 6:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T06:46:14.307Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 13, 2020 6:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T06:46:31.946Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Oct 13, 2020 6:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T06:46:31.972Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Oct 13, 2020 6:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T06:46:37.248Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 13, 2020 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T06:46:51.269Z: Workers have started successfully.
    Oct 13, 2020 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T06:46:51.302Z: Workers have started successfully.
    Oct 13, 2020 6:47:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T06:47:28.813Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 13, 2020 6:47:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T06:47:28.953Z: Cleaning up.
    Oct 13, 2020 6:47:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T06:47:29.029Z: Stopping worker pool...
    Oct 13, 2020 6:48:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T06:48:16.192Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 13, 2020 6:48:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T06:48:16.235Z: Worker pool stopped.
    Oct 13, 2020 6:48:21 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-12_23_45_56-7064660707752045496 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 28e72ee0-a2bf-46be-bf77-fdb0eeaa4757 and timestamp: 2020-10-13T06:48:22.036000000Z:
                     Metric:                    Value:
                   read_time                    17.072
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 13, 2020 6:48:22 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 44.059 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 6s
107 actionable tasks: 70 executed, 37 from cache

Publishing build scan...
https://gradle.com/s/nrpbzy6crohim

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1108

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1108/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Cap number of partitions from above, not below.

[Kenneth Knowles] Tighten visibility of DirectRunner classes now that Java ReferenceRunner

[Kenneth Knowles] Remove DirectRunner dependency on deprecated CreatePCollectionView

[Boyuan Zhang] Add unsupported BundleFinalizationHandler to portable batch Flink.

[ningk] [BEAM-11045] Fix screendiff integration tests

[Luke Cwik] [BEAM-11050] Duplicate accumulator if it appears in multiple windows.

[pawel.pasterz] Update dependency

[pawel.pasterz] Update dependency

[pawel.pasterz] Linting

[Steve Niemitz] [BEAM-11057] Make BigQueryIO.Read.withTestServices methods public

[Robert Bradshaw] Introduce intermediate superclass for both Dataframe and Series.


------------------------------------------
[...truncated 275.97 KB...]
    Oct 13, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 13, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 13, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 13, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 13, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 13, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
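
    The other remedy named in the message (.setCoder) can be sketched the same way; again an illustrative snippet rather than the test's own code, reusing a schema like the one outlined earlier in this thread:

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowCoderFixSketch {
      // RowCoder.of(schema) encodes Row values for the given schema; setting it
      // explicitly satisfies the coder lookup that fails in the stack trace above.
      static PCollection<Row> withRowCoder(PCollection<Row> rows, Schema schema) {
        return rows.setCoder(RowCoder.of(schema));
      }
    }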

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 13, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 13, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 13, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 13, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 13, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 13, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 13, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 13, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 13, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 218 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-i6PoH0jGsnT6-IZR2Ih0tgJxIPoDYHJigDNUiiPsigc.jar
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-LCxbad2JJxYvp_mWswhxQLJY4hG5MDevs6RY40oewxQ.jar
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-e1ZHmjpVYhSu8F2rBNSNB1v9a2dvuuhVmPQ0qRCxSrw.jar
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-cZoo_DCFxBqYvETTJ9nvekOF-LRUVS72fXFaGoH_fSM.jar
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-SDnjedhMKKc2Hi7wvzDi2iuWAlAACSlQti-eNYgordM.jar
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-Xig_QhyUgtLWjy7isMLU4J7mf20w6sq_Iddxq0mgfpQ.jar
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-xq5ya-kHAr6UI0AnbA45wkkWxlm-hMveZdW9W_ytu5E.jar
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-Q6mAG0rM2omLQfLJcNb_PSlo3B8fBvsBhM29hVWmuug.jar
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-5gMf9pyzt8uLXWLvWK_MfS9C51VKC90MP3m5_XLDqys.jar
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-c4a7tGdIJuJYEnPEYmIxaLUpcKxMbOCw8PubHL0BEdU.jar
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-kzraooP_r87N-MvByMZ9LcMQva5BiS3MYeOiRwuKxUU.jar
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-i6PoH0jGsnT6-IZR2Ih0tgJxIPoDYHJigDNUiiPsigc.jar
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-F-A3hgR63tC3Vu7QHM9iXFloVLDx2x7k8aa41XVA4ks.jar
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-QAZjtQoluTjCl0_m8Iy1lqteRCiwWSLgxHp9nXTf0r8.jar
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-439sLr_jrQeOTMQtaviZWat19ompGkOGRi1blMJaYEQ.jar
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-cWCdmtD9mn4HFg2t1FrqsEF5mA--SwwwYI9hVpa_rbE.jar
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-u0G6HKy9zqZVmzXw_wyHSGD8kfiThg13dsKN7PW_pHM.jar
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-aufMWpLFKRM7bbBXi4Esj9yYOtK0w6n6TDMyj-O5gXY.jar
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-6QPt1I9HWP9DmUrczxbgg5g9Q_SUZ1JWJWUs_kmesx4.jar
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-jm0FnHBQ8B-BPocMOl-HYGG0yybo-0rcm6mz9HoFsBI.jar
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5502693361987670671.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ra8d6TEgGKd6pbw3LJomzOBYzdDS49eBVhnTF8jErMc.jar
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-6QWho5X2n_ObbhTp4nNJSFmBJVuzgm9UvVdYl7srlWo.jar
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-yWEFmr-0TJ5oEcob6AB-dGEMixlfJLX8sBqfqotF7TQ.jar
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-SgIpOhxzPXPWveu_hqQJ_9hhxmF5FcdxyspVG_AUk5U.jar
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-jgC2FUuAa8YskebAp32LjDVkthfZ6y-iPkqsRRXjLlI.jar
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-B6gbn7TjU0HSCJYyJN7-TVShIHSRdfTJF8KZRTXqte8.jar
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-GZku_FqjR81GiUyHT_YMwPHghlBDoiw6dbfnm7kI6U0.jar
    Oct 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-rywXKpLBQLRTPAqSNnKpI_DW1NEbdBLct3ezWqia-gM.jar
    Oct 13, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-RS6K2ZtBj52i4DoXfMhyVGAdIkUgnevB7CFiGf2yvho.jar
    Oct 13, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-h6ascj3OKGjar7KBqvVZZr9ZaLN1CzwEwfOJ7Qlk1U4.jar
    Oct 13, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-xOqVYrLEV4Ep32zQGc-L0rcDyh_vAl7W53ZyX06_fg0.jar
    Oct 13, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.mongodb/mongo-java-driver/3.12.7/1f45c6a397feeb46da75425619333d1cc6f90f78/mongo-java-driver-3.12.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/mongo-java-driver-3.12.7-D_zgBJWNb9mzmuetJ37a0X9XtpcfSGsXYpxe6eE8Tao.jar
    Oct 13, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.alibaba/fastjson/1.2.68/9e3d29f05bcfab1c15a1357ebf2dd513c1d42f49/fastjson-1.2.68.jar to gs://temp-storage-for-perf-tests/loadtests/staging/fastjson-1.2.68-cGrbCezeeBQfDPJGWh6b307ug_n5g8_BYqWhckhy_rs.jar
    Oct 13, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 186 files cached, 32 files newly uploaded in 1 seconds
    Oct 13, 2020 12:45:35 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 13, 2020 12:45:35 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 13, 2020 12:45:35 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 13, 2020 12:45:35 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 13, 2020 12:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 13, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95588 bytes, hash 6bf980bfc2827f3cab4da666d2c77244f7254a15946b8821c2df04db50c354f7> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-a_mAv8KCfzyrTaZm0sdyRPclShWUa4ghwt8E21DDVPc.pb
    Oct 13, 2020 12:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 13, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-12_17_45_36-15196112752773302239?project=apache-beam-testing
    Oct 13, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-12_17_45_36-15196112752773302239
    Oct 13, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-12_17_45_36-15196112752773302239
    Oct 13, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-13T00:45:36.152Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 13, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T00:45:43.311Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 13, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T00:45:44.122Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 13, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T00:45:44.155Z: Expanding GroupByKey operations into optimizable parts.
    Oct 13, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T00:45:44.180Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 13, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T00:45:44.239Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 13, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T00:45:44.272Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 13, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T00:45:44.308Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 13, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T00:45:44.333Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 13, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T00:45:44.654Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 13, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T00:45:44.715Z: Starting 5 workers in us-central1-c...
    Oct 13, 2020 12:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T00:46:08.742Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Oct 13, 2020 12:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T00:46:08.779Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Oct 13, 2020 12:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T00:46:10.845Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 13, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T00:46:14.151Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 13, 2020 12:46:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T00:46:30.437Z: Workers have started successfully.
    Oct 13, 2020 12:46:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T00:46:30.460Z: Workers have started successfully.
    Oct 13, 2020 12:47:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T00:47:04.125Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 13, 2020 12:47:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T00:47:04.270Z: Cleaning up.
    Oct 13, 2020 12:47:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T00:47:04.341Z: Stopping worker pool...
    Oct 13, 2020 12:47:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T00:47:57.957Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 13, 2020 12:47:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-13T00:47:58.021Z: Worker pool stopped.
    Oct 13, 2020 12:48:05 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-12_17_45_36-15196112752773302239 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 446610a1-44cc-4c2c-a14b-c493578cf82b and timestamp: 2020-10-13T00:48:05.365000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.336

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 13, 2020 12:48:05 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 44.624 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 50s
107 actionable tasks: 72 executed, 35 from cache

Publishing build scan...
https://gradle.com/s/dwjdq23l7d6w2

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1107

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1107/display/redirect?page=changes>

Changes:

[Kenneth Knowles] Remove stability history from Dataflow streaming VR job

[tysonjh] [BEAM-11053] Set projectId in BigQuery service.

[Robert Bradshaw] [BEAM-9547] Implement covariance and correlation.

[Robert Bradshaw] [BEAM-9547] Dataframe covariance and correlation.

[Robert Bradshaw] Scalar apply method.

[noreply] Revert "Delete unneeded PCollections in pipeline_from_stages() (#13014)"

[noreply] [BEAM-9196] Update testcontainers to 1.15.0-rc2 (#13071)


------------------------------------------
[...truncated 283.92 KB...]
    Oct 12, 2020 6:46:42 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 12, 2020 6:46:42 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 12, 2020 6:46:42 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 12, 2020 6:46:42 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 12, 2020 6:46:42 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 12, 2020 6:46:42 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 12, 2020 6:46:42 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
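
The IllegalStateException above already names the remedy: a PCollection of Beam Rows needs a schema (via PCollection.setRowSchema) or an explicit Coder before downstream transforms are applied. The following is a minimal, hypothetical sketch of that pattern in user code; the schema, element values, and pipeline are illustrative only and are not the code of BigQueryIOPushDownIT:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Schema for the Rows emitted by the ParDo below (illustrative fields).
        Schema schema =
            Schema.builder().addStringField("author").addInt64Field("score").build();

        PCollection<Row> rows =
            p.apply(Create.of("alice,3", "bob,5"))
                .apply(
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void processElement(
                              @Element String line, OutputReceiver<Row> out) {
                            String[] parts = line.split(",");
                            out.output(
                                Row.withSchema(schema)
                                    .addValues(parts[0], Long.parseLong(parts[1]))
                                    .build());
                          }
                        }))
                // Without this call, pipeline construction fails with the same
                // "Cannot provide a coder for a Beam Row" IllegalStateException.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }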

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 12, 2020 6:46:42 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 12, 2020 6:46:42 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 12, 2020 6:46:42 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 12, 2020 6:46:42 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 12, 2020 6:46:42 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 12, 2020 6:46:42 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 12, 2020 6:46:42 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 12, 2020 6:46:43 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
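
The plan and filter lines above show what the SQL push-down amounts to at the IO level: only the used fields are requested and the supported predicate is handed to the BigQuery Storage read. A rough, hypothetical equivalent written directly against BigQueryIO is sketched below; the project/dataset/table reference and option values are illustrative and are not taken from this job:

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadPushDownExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply(
            "Read with column/filter push-down",
            BigQueryIO.readTableRows()
                // DIRECT_READ uses the BigQuery Storage API, which is what makes
                // projection and filter push-down possible in the first place.
                .withMethod(TypedRead.Method.DIRECT_READ)
                .from("my-project:my_dataset.HACKER_NEWS") // illustrative table
                // Counterpart of usedFields=[[by, type, title, score]] in the plan.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Counterpart of the pushed-down filter logged above.
                .withRowRestriction(
                    "(`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        p.run().waitUntilFinish();
      }
    }
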
    Oct 12, 2020 6:46:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 218 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-LSReZT-JMg52kvrxU_ckmENafgNrftkgVDhCPIPNUlQ.jar
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-H8MB4XAfQjrLuxIbmoI0v_ir11DUvOQGKqPcVBbr2Hc.jar
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-eGng9TZXELCmPG_vL2UEA5YI72m5wRtmvpe3Y5iFHAY.jar
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-MTDffi3qwa2wIODWy-KrlvT0TbdoSU87GhiCIlSuw_I.jar
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-MOvGQgtRKzVlBvL_uc74uMEC4vhVQeOqqiDnSDoj0uk.jar
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-jdukfEuL94HpOLv1xqif2cMEx9TkaPpMt1J7arskxrI.jar
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-a1cAr7-rBLSZUGxrp1MqvXS5i84DdMfYTfhKj5xhcQU.jar
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-OPLwEJdfF-6YjngUwAku8C30NIWe2ciCzqWEa6REGUg.jar
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/6.6.1/workerMain/gradle-worker.jar to gs://temp-storage-for-perf-tests/loadtests/staging/gradle-worker-U_aL5a9pcLxwBxCREhg6dHkIraZJd5OxEJ7qw-KLY7I.jar
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-njQtmJLLmdYZhv3uoLi7KXXlNSb-Rrw3F_Y4L2-dAAY.jar
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-TNCJEQblN3YsLxnBMEUhEQQqIzUR9cO1yPYzp2NOiGE.jar
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-ANDbZv2GzjF_I9o8TCQ23druegkHks0BKO5qj73WB1g.jar
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-LSReZT-JMg52kvrxU_ckmENafgNrftkgVDhCPIPNUlQ.jar
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test806910953215402280.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-l39DZKFaMXok8H5AI8yqkt9GRNNWhxy_gJo_L8lzCDA.jar
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-FUE1Nm-OUIoXNgzoOrvYLvxM01pT1beKlDAgrRtsOiY.jar
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-IsljYp2n2Dz3uFoB22Tmqo9b6x0p5cAsTAb3QErKYp0.jar
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-20UqyFC7UmBpcWPRsCq_BvbqdayW2C3m0uw1u8Zuqg4.jar
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-Fdqa6m7qqhF4Jdtm7wUQTHC1HNNbO4a8JoCCAwjzYx0.jar
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-CoXDHlTT2sFJc9u_cYpVBWaJYHp7Z4eLY-OlVkK84l0.jar
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-aotdSqCk3YO7KNeeVC5N2ttkbCObJPZKn1D6ecijv2k.jar
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-5Gu-FQkktau4pg8U2cWU5jb4z96gVToOWGaPoofFbtE.jar
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-dQGbcOp3hrcCjfCjoKrbY-KXuMUgIBU4N-C7mlxjEiA.jar
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-hzg2a8fwOPDq07VJs6CctGhWwxoVGoc6XxVdG3fWdxU.jar
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-7gNBqPjJfhCBoes2vWfcqSaPJehRU8Buv03aU7WnLlQ.jar
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-vBRbBaAFCdhYC3mWcu9IgL7PyK8z7dS3NyIb54qnDfQ.jar
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-4S5BlfglYdB9Juo42uazKwAAuvdH4fwgJG07NNeEQ9o.jar
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-61zcw7vr-Iup1fFHwQKL0Th3N_-doUpQ7TEIzyGSNAc.jar
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-LMJBk0IzcVMaWeUqPVceB4FiKWYdr6QewTJuKuRObgw.jar
    Oct 12, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-k5ZACF42gO-t5uvhMpQmhQoR83O5fOmdQ5Ve0MZ5kuY.jar
    Oct 12, 2020 6:46:47 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-mdoux0eLji1lzAZ_MuFGkI9CDplndYphtRhw0oCtvIs.jar
    Oct 12, 2020 6:46:47 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-H5NmBG16RqxFCedRl_CacPPUA7baod6-QmOEoAAPUnA.jar
    Oct 12, 2020 6:46:47 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-0ZYdtog1AQYvohIrZvDd80q82U8gVaL0OvqtSE9v78g.jar
    Oct 12, 2020 6:46:47 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 187 files cached, 31 files newly uploaded in 1 seconds
    Oct 12, 2020 6:46:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 12, 2020 6:46:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 12, 2020 6:46:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 12, 2020 6:46:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 12, 2020 6:46:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 12, 2020 6:46:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95584 bytes, hash 791f7e5214f96e01c0e1e3eaf1ca51b5c8cb51f8059b0d25464813b10a1fc64e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-eR9-UhT5bgHA4ePq8cpRtcjLUfgFmw0lRkgTsQofxk4.pb
    Oct 12, 2020 6:46:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 12, 2020 6:46:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-12_11_46_48-17504180318085126351?project=apache-beam-testing
    Oct 12, 2020 6:46:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-12_11_46_48-17504180318085126351
    Oct 12, 2020 6:46:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-12_11_46_48-17504180318085126351
    Oct 12, 2020 6:46:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-12T18:46:48.386Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 12, 2020 6:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T18:46:56.453Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 12, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T18:46:57.305Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 12, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T18:46:57.968Z: Expanding GroupByKey operations into optimizable parts.
    Oct 12, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T18:46:57.998Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 12, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T18:46:58.059Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 12, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T18:46:58.085Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 12, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T18:46:58.116Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 12, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T18:46:58.152Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 12, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T18:46:58.814Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 12, 2020 6:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T18:46:58.908Z: Starting 5 workers in us-central1-c...
    Oct 12, 2020 6:47:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T18:47:20.123Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 12, 2020 6:47:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T18:47:23.424Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Oct 12, 2020 6:47:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T18:47:23.466Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Oct 12, 2020 6:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T18:47:28.827Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 12, 2020 6:47:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T18:47:47.299Z: Workers have started successfully.
    Oct 12, 2020 6:47:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T18:47:47.350Z: Workers have started successfully.
    Oct 12, 2020 6:48:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T18:48:21.015Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 12, 2020 6:48:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T18:48:21.222Z: Cleaning up.
    Oct 12, 2020 6:48:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T18:48:21.307Z: Stopping worker pool...
    Oct 12, 2020 6:49:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T18:49:11.221Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 12, 2020 6:49:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T18:49:11.302Z: Worker pool stopped.
    Oct 12, 2020 6:49:17 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-12_11_46_48-17504180318085126351 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 7b0e6aa9-dda1-4387-8252-3bc9f4f5edbc and timestamp: 2020-10-12T18:49:17.868000000Z:
                     Metric:                    Value:
                   read_time                    13.076
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 12, 2020 6:49:18 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 2 mins 45.913 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 54s
107 actionable tasks: 75 executed, 32 from cache

Publishing build scan...
https://gradle.com/s/dksswp35bkadg

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1106

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1106/display/redirect>

Changes:


------------------------------------------
[...truncated 272.00 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 12, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 12, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 12, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 12, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 12, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 12, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 12, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 12, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 12, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 12, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 12, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 12, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 12, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 12, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 12, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 12, 2020 12:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-2MhP3IRO9hung-9y6D06Tj7o9QcCr1MKvy0oJ9Qt9Kc.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-U-VjgFEZfMJVhelSJqhPmFVoY6nGPNHs59puqAD4XLs.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-MAA5GaF-GJFK0qXonKS7ozJUmM7k6GM7ZXJKrgqo76Y.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8547894783050613553.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-UVq1WL6xWurcbMO_peTOrlto2I8Qd4h8dIjv_0btIEY.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-2Scqk4vLHr1lwnsy2w1ctwA16Y0kQV17f5EPpD2yZAE.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-QSBWz_PunMBLOMIybMLWgvtFb1_KiAHmhhSZj9WPdSA.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-XYF2yoAo-oumsEy8OBxlechYJZKsweVFiPuKYgSaeFw.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-JBcKvmFD1NS3QpJiHCIOV2jRDsKjXVcStbYUmnvXcCU.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-Uv7ucfLuq6_H03BzoQ0SDXsK7NcQrklNDzdJAgyHTmM.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-JBcKvmFD1NS3QpJiHCIOV2jRDsKjXVcStbYUmnvXcCU.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-7hvcNlYZckUcYWpDDnjSUTWBGGW74-uECL0pgefLR58.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-pOEu6fwJxzcAM_Hv8dyLfcGb57L8ZM71SQt30b9OUE4.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-agOjEhPqBKENKhbcBCPeHoy4HKiUi0r96_yBdtynUYc.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-4E8-rZdGBYfyBJNsQLv4kZnQAc0bfKnYL9PyTDKR1kg.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-YNTzv2j0dvrQVIRo898xzw5BNAkOrHNgnSF2azZxM5Q.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-j03pmyCJbr9l45TXJTcX_obvnl6d4747FJGv-sIWQNI.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-pSZKDu93wwvQQTVxkJUHOucatPe_Wq6vYDGGl2HKMGY.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-nPl-iAZF2-sBxcxMh8X8zPGOjIF1u8TA9NVjlsCCuGY.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-0oYGuC80abMO5qjQSAWPvkD09oPlX2w3OFo5OJppHzQ.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-c26c3hnk7m2JOp3kTkuE3yJbcSaKeZ8qMEo5S0IXZl4.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-rFBENwUe6-vbiyWAFn5cgSOXqdSn71dXCjTrwufREvk.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-FCr6F9Ecr7gO0hV9B_PvSbagSYiJfhMPyvHKbK2zEko.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-emmicRYG7rgM6s0xhy-0Quf-3hInyS9ZFRtTHVYDsNw.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-2BnVQJbgC267dzLgj3bA5iWpVQHx_6QBPql1IzPlgvs.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-vSYr8RFxiPg5LlFpn7_cPvgmQ1DQ5VZzKDHY5eKY1kY.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-wmAtcPxHl1t8Ss6ZN387JatKfbppKCyka7eVMLecynI.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-sRYaqNLApiy0OMLjnF6rmodZ8ib401p_9Px9PM0qBec.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-gjpPwakZXkz-4sRtUJJKchH5HvRebfFDMbehQ1JYZ98.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-4gR86wt88f6xu8U752XZHxzo1jqWhJrFObjHRVYX9i0.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-inzmbtDRqvjDCZtsG0Chl-gqek5hAHVci98rU1SiQ80.jar
    Oct 12, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-rMmtpcjTw6c7YgRSRPo6CvJkPlffRuzAOJIfrCmH2cY.jar
    Oct 12, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 0 seconds
    Oct 12, 2020 12:45:14 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 12, 2020 12:45:14 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 12, 2020 12:45:14 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 12, 2020 12:45:14 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 12, 2020 12:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 12, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96274 bytes, hash af999abe86d9834804c95e9787ad7e96175c25ffd07e8026ad96a06f9ce590b1> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-r5mavobZg0gEyV6Xh61-lhdcJf_QfoAmrZagb5zlkLE.pb
    Oct 12, 2020 12:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 12, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-12_05_45_14-3290249035817748441?project=apache-beam-testing
    Oct 12, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-12_05_45_14-3290249035817748441
    Oct 12, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-12_05_45_14-3290249035817748441
    Oct 12, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-12T12:45:14.905Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
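
    The warning above reflects the worker-pool options the test runs with. A small sketch, using the standard Dataflow runner options, of how that combination is expressed in code; the fixed worker count of 5 is taken from the log, everything else is illustrative:

        import org.apache.beam.runners.dataflow.DataflowRunner;
        import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
        import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
        import org.apache.beam.sdk.options.PipelineOptionsFactory;

        public class WorkerPoolOptionsSketch {
          public static DataflowPipelineOptions fixedPoolOptions() {
            DataflowPipelineOptions options = PipelineOptionsFactory.as(DataflowPipelineOptions.class);
            options.setRunner(DataflowRunner.class);
            // autoscalingAlgorithm=NONE pins the pool to numWorkers...
            options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE);
            options.setNumWorkers(5);
            // ...so the requested maxNumWorkers is ignored, which is what the warning reports.
            options.setMaxNumWorkers(5);
            return options;
          }
        }
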
    Oct 12, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T12:45:30.963Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 12, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T12:45:31.524Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 12, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T12:45:31.568Z: Expanding GroupByKey operations into optimizable parts.
    Oct 12, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T12:45:31.864Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 12, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T12:45:31.952Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 12, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T12:45:32.001Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 12, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T12:45:32.035Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 12, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T12:45:32.070Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 12, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T12:45:32.468Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 12, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T12:45:32.548Z: Starting 5 workers in us-central1-c...
    Oct 12, 2020 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T12:45:39.415Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 12, 2020 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T12:45:55.370Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 12, 2020 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T12:46:17.218Z: Workers have started successfully.
    Oct 12, 2020 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T12:46:17.252Z: Workers have started successfully.
    Oct 12, 2020 12:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T12:46:58.199Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 12, 2020 12:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T12:46:58.333Z: Cleaning up.
    Oct 12, 2020 12:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T12:46:58.411Z: Stopping worker pool...
    Oct 12, 2020 12:47:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T12:47:48.650Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 12, 2020 12:47:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T12:47:48.710Z: Worker pool stopped.
    Oct 12, 2020 12:47:55 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-12_05_45_14-3290249035817748441 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): fbbcba13-f05f-4a7e-8bb8-24390bb9fee3 and timestamp: 2020-10-12T12:47:55.450000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    22.055

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 12, 2020 12:47:55 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 54.213 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 40s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/ubxiyd7r7lthe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1105

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1105/display/redirect>

Changes:


------------------------------------------
[...truncated 272.69 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 12, 2020 6:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 12, 2020 6:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 12, 2020 6:45:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 12, 2020 6:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 12, 2020 6:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 12, 2020 6:45:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 12, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
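
    The failure is the one the exception text itself points at: the RowMonitor ParDo emits Beam Row values, no Coder can be inferred for Row, so a schema must be attached to the output PCollection. A minimal sketch of that fix, with an assumed identity DoFn standing in for the test's RowMonitor and an illustrative schema matching the selected columns:

        import org.apache.beam.sdk.schemas.Schema;
        import org.apache.beam.sdk.transforms.DoFn;
        import org.apache.beam.sdk.transforms.ParDo;
        import org.apache.beam.sdk.values.PCollection;
        import org.apache.beam.sdk.values.Row;

        public class RowSchemaSketch {
          // Stand-in for the test's RowMonitor DoFn (illustrative identity transform).
          static class RowMonitorFn extends DoFn<Row, Row> {
            @ProcessElement
            public void process(@Element Row row, OutputReceiver<Row> out) {
              out.output(row);
            }
          }

          static PCollection<Row> monitorWithSchema(PCollection<Row> rows) {
            Schema schema =
                Schema.builder()
                    .addStringField("author")
                    .addStringField("type")
                    .addStringField("title")
                    .addInt64Field("score")
                    .build();
            // setRowSchema attaches a SchemaCoder to the output, which is exactly what the
            // IllegalStateException asks for when no Coder can be inferred for Row.
            return rows.apply("RowMonitor", ParDo.of(new RowMonitorFn())).setRowSchema(schema);
          }
        }
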

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 12, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 12, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 12, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 12, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 12, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 12, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 12, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 12, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 12, 2020 6:45:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-YzTXXGHCa1lC3M_8an9OGibsrIwgNjgacMbtm_sAA6A.jar
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-hjeeuvlCyGF_272--X-momUXQOTXh3rfzDQmugViksU.jar
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-yhEM3xMMwA0P3ioCOo9IGZXleWZG838KXBm013Ep87U.jar
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-s3x8gwNsr15xu2vSeBLOAjzvZLNXEbMWWn34hYzteWI.jar
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-8i_plqxD8C8bHj-Mrs4M_9RzAcHH4RtBOxFtNF1wRqY.jar
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-vZ6pH8FQFYlCYgi0DTpfyvF6jaNrHHFbWo8ttK5vRCc.jar
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test322832929526520816.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-zpB6Di651oCvAKq1U9nE32PRCYzIqhPIxvLGxdTIzms.jar
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-uSl2re11skobh8Hh3NNe7bujjWJuzLgzR6ZXGzaKzQ8.jar
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-EHaQNRQNvalOZhr2kEietd3YzotmtOwdAXD9LjNY9Bc.jar
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-U0Wolc6UuZJFuCWAD8Y814PkFhpRqHnoI8Cqg4T-6Lg.jar
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-vGfMcDTOsmNgcUB5ZlUbSfs5IWk8hdVCnisT2uW2T5w.jar
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-r5WH9XH-UIGWr1Go8ITU0cJ6In6HdlRd6CNV82cCECA.jar
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-_VIqrf8CcKUouuap84wnbQPwW8uzAKwGbmNfjuIitKU.jar
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-XPwB5aqreypf0egL4WqrdWe6QZHZO4zcq9Iqa6BK5ts.jar
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests--Tm41ZrNkS9AD7p4ov3KK_unF5HxEGneqGLWMzuFrz8.jar
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-YzTXXGHCa1lC3M_8an9OGibsrIwgNjgacMbtm_sAA6A.jar
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-B6Gpvosf1uL8nO2bCBKFQrw72yg3OGKMEfR31dbZ-hE.jar
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-wD_gX34goGl-pBqa5IW_RxEsFJ3tiXtNi61mv_SdXUU.jar
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-9ObBagjqHiYcvCQqA8H6VXJFv_D8XSh0zEwm6XRa61w.jar
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-l5Z1D-AqAfDI6uaYEazUrDiLGG9wWpDOSi75qOmLFnI.jar
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-mCOAAp7TElVkwCB9EOt5Y3FKbjk4kvqWYbbRPLXWqpQ.jar
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-77O9EsmPvgC34T-WGgUP_ODbd5U9yERiHB56X7ixHao.jar
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-HS3VkPDeUM9hAdohtexyI7m9o54zphk1SFdiFR_h-_A.jar
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-FrvFwF_hulV4EhzegYVuG8My2BLz7qnKe0eF4QGwkpw.jar
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-Qlbc3aLivjZ6j2wl01nHHf0Q8IoWwgdIU4QroXClL6Y.jar
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-jmr5ijWBkK61pqhhBiCXmjFFZL_Z56WYXiNtR_ri8OM.jar
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-1Y1bPOn4eGJc5AjeYC7lqYrJ-1lvK2Yih-d98smpjn0.jar
    Oct 12, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-_ZUzwb_-4OEEjqFlzkdWhn49InZ32RILbwAXKequJj0.jar
    Oct 12, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-cpHH9jmaHJPm-dBmDF0t8_jOE8BBscy4TH_EPJqOwto.jar
    Oct 12, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-WKjXxxAOYy9kCpm50wNfLvVkT0HAHM9julw98OS9bTI.jar
    Oct 12, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-f2pW0Lq7yC8WvLnSZ8Vsi71VFv2yl-OMjipENbsjL4c.jar
    Oct 12, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 0 seconds
    Oct 12, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 12, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 12, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 12, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 12, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 12, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96274 bytes, hash 63559b96bca44cad4c56c12f951749a04b02fc22c5553fe282cd8bdc97c20db1> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Y1WblrykTK1MVsEvlRdJoEsC_CLFVT_igs2L3JfCDbE.pb
    Oct 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 12, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-11_23_45_15-8939381493223574305?project=apache-beam-testing
    Oct 12, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-11_23_45_15-8939381493223574305
    Oct 12, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-11_23_45_15-8939381493223574305
    Oct 12, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-12T06:45:15.227Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T06:45:23.615Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T06:45:24.321Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T06:45:24.398Z: Expanding GroupByKey operations into optimizable parts.
    Oct 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T06:45:24.436Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T06:45:24.510Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T06:45:24.549Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T06:45:24.584Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T06:45:24.641Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T06:45:25.019Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T06:45:25.088Z: Starting 5 workers in us-central1-c...
    Oct 12, 2020 6:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T06:45:43.550Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 12, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T06:45:48.066Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Oct 12, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T06:45:48.096Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Oct 12, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T06:45:53.438Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 12, 2020 6:46:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T06:46:10.062Z: Workers have started successfully.
    Oct 12, 2020 6:46:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T06:46:10.092Z: Workers have started successfully.
    Oct 12, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T06:46:44.473Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 12, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T06:46:44.617Z: Cleaning up.
    Oct 12, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T06:46:44.697Z: Stopping worker pool...
    Oct 12, 2020 6:47:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T06:47:32.124Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 12, 2020 6:47:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T06:47:32.166Z: Worker pool stopped.
    Oct 12, 2020 6:47:39 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-11_23_45_15-8939381493223574305 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): df926267-4495-4384-8649-6cb0e5a85ab8 and timestamp: 2020-10-12T06:47:39.464000000Z:
                     Metric:                    Value:
                   read_time                    16.253
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 12, 2020 6:47:39 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 37.898 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 24s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/dei3fbyvo7fde

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1104

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1104/display/redirect>

Changes:


------------------------------------------
[...truncated 271.82 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 12, 2020 12:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 12, 2020 12:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 12, 2020 12:45:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 12, 2020 12:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 12, 2020 12:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 12, 2020 12:45:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 12, 2020 12:45:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
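
The IllegalStateException above is Beam's generic coder-inference failure for a
PCollection<Row>: the SDK cannot pick a coder for Row unless the pipeline either
attaches a schema or sets a coder explicitly, exactly as the message suggests. A
minimal, self-contained sketch of the first remedy, PCollection.setRowSchema, follows;
the pipeline, schema, and field names are illustrative only and are not taken from
BigQueryIOPushDownIT.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Hypothetical schema, loosely mirroring the columns projected by the query above.
        final Schema schema =
            Schema.builder().addStringField("author").addInt64Field("score").build();

        // A ParDo that emits Beam Rows. Coder inference cannot handle Row, so without
        // setRowSchema() (or an explicit coder) pipeline construction fails with the
        // IllegalStateException quoted in the log.
        PCollection<Row> rows =
            p.apply(Create.of(1L, 2L, 3L))
                .apply(
                    ParDo.of(
                        new DoFn<Long, Row>() {
                          @ProcessElement
                          public void processElement(@Element Long score, OutputReceiver<Row> out) {
                            out.output(
                                Row.withSchema(schema).addValues("author-" + score, score).build());
                          }
                        }))
                .setRowSchema(schema); // attaches a SchemaCoder so downstream transforms can consume the rows

        p.run().waitUntilFinish();
      }
    }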

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 12, 2020 12:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 12, 2020 12:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 12, 2020 12:45:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 12, 2020 12:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 12, 2020 12:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 12, 2020 12:45:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 12, 2020 12:45:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 12, 2020 12:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 12, 2020 12:45:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-xhvmZ6Usk4yF88Yfrk3CZmlAh5OM-UD-6xngYUC9Iko.jar
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3626521704093454135.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-3aggzIm07RdrROX5Dh8jdaaxqCKYFDBwmLyPP8-8LcA.jar
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-6ga8X4XUf-Bv9H3TTWBmeT13kT53ycbwU7cqkdMvHkA.jar
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-LxyiYipKlIpeF3ukiDaK80_wH6ln3RvQz20_IDUmCOs.jar
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-SQFKNwHlHPKc8l9U0ncxEDD6pEfrnXaKLGHHKG4E6_k.jar
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-xhvmZ6Usk4yF88Yfrk3CZmlAh5OM-UD-6xngYUC9Iko.jar
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-u2wLSJnjCa8aKIPe_ed2pe9iaPhLBNJmqnfG4lqvnFU.jar
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-GW_VccCNnGepMIZlqmrSzbXv-Rqsuqv-P_swY4oc_tA.jar
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-CPgmUH1xZLv2FykQELjFTzXZP3wi-iy0bYaDVa8Zr8c.jar
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-VuzsxjUCvwHK8y3rMtOb3OAmPxwX9d2bs1R6Gm6bxHs.jar
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-mCARnnF9gxOZV7J020lIPq1mWqgVt7nMvZY6ZmW9DTk.jar
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-RitG_ip5aNJXZv1jVk1dID9N4r9A6aYEp4yytie3w1k.jar
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-ekMQntlmcmAByJvDY8lbwD-Fn9yzUOUKq5H3KILF0DU.jar
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-_HxNMLEBDKawVElOBCYmHlcur4aQRlKegFs8FWqe48Y.jar
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-teED1oICBNYT1GHAVQBHwM_yPuuMbzHTGU8uKp2PxRE.jar
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-WfymUPL6dR5q9trmBv5odrnW75Q7uydHUGGcATtkxz8.jar
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-xjL_bZ7mOPO1Z4OU8qzSC1rm5uylzNKYAEB5oVtu5fA.jar
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-Uuuq1n53Mnr8eeECUYVLzapbIkxtghEE5QVIgiV3RsI.jar
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-dm2kDJlODpupysFlUXoqZ43UuVJB5YhKex6YPBu7Els.jar
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-RC2JjmZLs32qF2zW_MRGUsQmgJ0s9wwN9wKEVA1jP4Q.jar
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-a8yMMVF6lxuSEhFo7XZw1wbDZ4zbe_tKKsUG4GmzY5E.jar
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-6g4PrkgyF51u0hcGdbsaLdSnqupNNmu3S9CmveblK9s.jar
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-EquxAXYo_cPGmD630wH_EnhIxgTqDfJ_9pxR3kiGVTU.jar
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-eA1W2ZLgmy1HX5aR1S_dqgxd2JKEUoN01Llc5qgWy9I.jar
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-IYKSNyiIRTy10dg3d-9HfbgUkfT0lcWYz7cCb4znjM8.jar
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-kI9CG5p5eUOcWL26CUZEKehPzz7sF9NrXvL_PfQzy5U.jar
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-GZvOS32vqJzILxjrbUQBao8b5W2HXz5saRrrz1KXcL4.jar
    Oct 12, 2020 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-4EpJltHUHO9tSS2afNoVi-GqVceGZzk34QaYtzalRFw.jar
    Oct 12, 2020 12:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-ZFAWZ3Zg0mJt7Ex1U7EgpIFESja-9goSmhLjFVQixzo.jar
    Oct 12, 2020 12:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-7cUlKTi7j8rOzx3eNdWyjk0jIqfnik6XCdSNTqh0LsY.jar
    Oct 12, 2020 12:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-DrBF6d0K5y31MB6WXytQiphG-lKWJkEtpOZrlutRX-8.jar
    Oct 12, 2020 12:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 0 seconds
    Oct 12, 2020 12:45:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 12, 2020 12:45:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 12, 2020 12:45:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 12, 2020 12:45:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 12, 2020 12:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 12, 2020 12:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96274 bytes, hash 6813f3d98ad817aa8a48890d66a54cdff2dde71f3c7c67bdcfe7a1b2de554d69> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-aBPz2YrYF6qKSIkNZqVM3_Ld5x88fGe9z-ehst5VTWk.pb
    Oct 12, 2020 12:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 12, 2020 12:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-11_17_45_14-13955047462216579908?project=apache-beam-testing
    Oct 12, 2020 12:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-11_17_45_14-13955047462216579908
    Oct 12, 2020 12:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-11_17_45_14-13955047462216579908
    Oct 12, 2020 12:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-12T00:45:14.172Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 12, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T00:45:21.149Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 12, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T00:45:21.971Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 12, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T00:45:22.035Z: Expanding GroupByKey operations into optimizable parts.
    Oct 12, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T00:45:22.052Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 12, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T00:45:22.126Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 12, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T00:45:22.161Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 12, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T00:45:22.196Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 12, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T00:45:22.229Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 12, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T00:45:22.578Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 12, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T00:45:22.658Z: Starting 5 workers in us-central1-c...
    Oct 12, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T00:45:45.869Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 12, 2020 12:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T00:45:48.536Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 12, 2020 12:46:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T00:46:05.670Z: Workers have started successfully.
    Oct 12, 2020 12:46:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T00:46:05.707Z: Workers have started successfully.
    Oct 12, 2020 12:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T00:46:46.369Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 12, 2020 12:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T00:46:46.528Z: Cleaning up.
    Oct 12, 2020 12:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T00:46:46.607Z: Stopping worker pool...
    Oct 12, 2020 12:47:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T00:47:34.853Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 12, 2020 12:47:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-12T00:47:34.952Z: Worker pool stopped.
    Oct 12, 2020 12:47:42 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-11_17_45_14-13955047462216579908 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 3e38683c-54ce-4a3f-882c-67b32457a2ea and timestamp: 2020-10-12T00:47:42.061000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    21.841

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 12, 2020 12:47:42 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 41.737 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 28s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/rfasvywd4vhss

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1103

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1103/display/redirect>

Changes:


------------------------------------------
[...truncated 272.33 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 11, 2020 6:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 11, 2020 6:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 11, 2020 6:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 11, 2020 6:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 11, 2020 6:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 11, 2020 6:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 11, 2020 6:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
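
The same exception also names .setCoder() as an alternative to attaching a schema. A
sketch of that route, again with an illustrative schema rather than anything from the
failing test, builds the Row PCollection with an explicit RowCoder; for a PCollection
produced by an existing transform (such as the ParDo(RowMonitor) named in the trace
above), the equivalent call would be setCoder(RowCoder.of(schema)) on its output before
the pipeline is finalized.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowCoderSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Hypothetical two-column schema; field names are illustrative only.
        Schema schema = Schema.builder().addStringField("type").addInt64Field("score").build();
        Row row = Row.withSchema(schema).addValues("story", 3L).build();

        // RowCoder.of(schema) supplies the coder explicitly, so no inference is needed.
        PCollection<Row> rows = p.apply(Create.of(row).withCoder(RowCoder.of(schema)));

        p.run().waitUntilFinish();
      }
    }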

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 11, 2020 6:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 11, 2020 6:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 11, 2020 6:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 11, 2020 6:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 11, 2020 6:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 11, 2020 6:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 11, 2020 6:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 11, 2020 6:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 11, 2020 6:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 11, 2020 6:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 11, 2020 6:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-snJMez2sNas9N78BQr0qDPwzvi9Q6jOIC8sjJ0pUeW8.jar
    Oct 11, 2020 6:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-jGk7CcV7zgWWboHW5KyZy38hbsj7Bi_O71w4KZ71G6E.jar
    Oct 11, 2020 6:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-TefDDli_c6QteqVD5Zjxu5tsHRfYu0bzTy3PyVFbAUQ.jar
    Oct 11, 2020 6:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-sRjujICNHVcVbAiXy5Ds397aZ-X9AqZKanoU1rXFJ2M.jar
    Oct 11, 2020 6:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-MtUxPmehEqAE6BoXNWYWvW_UQmWtCDvpSYQ26HkUzYg.jar
    Oct 11, 2020 6:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-BXVfcgiDxQ-_XJRWz74FZPkoEvCpsKAPxzKAEqPBT64.jar
    Oct 11, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-njUGrXEIri0_b32bfuQV0OoPR3Q2c44Rv1o1IbamtTo.jar
    Oct 11, 2020 6:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-U-l_OonLPZsmV8dYwtpLK8iB4ZBsaVajdHDFwRUiEgA.jar
    Oct 11, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4165025435705097423.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-zOoxmdOHBuMaZWChtxalHHLyz0BOzQ7KNUTy1OVcTDw.jar
    Oct 11, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-E_5PQq--okKkLJsWg_xYXMjP8b3oKKlSxmUZN7k6G8c.jar
    Oct 11, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-H62tA3Zt-cW7b3U5wR-G8OtyJBipTi05jHF8NhW-hFw.jar
    Oct 11, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-OAfqr5LYxiyoEe8CghpKOkuqdSxYwIXGpJm75nze14E.jar
    Oct 11, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-G10T4EChR7CimK2yD0CHrcJJnsZrUJHnHhryW_CISuM.jar
    Oct 11, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-YDfUxGG8tK96_0J2CYz2QnmdgJIi4o931dXxZlA7NW0.jar
    Oct 11, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-JtL4sOxP6UjXbEWhxvuy40j-TktZKh537dAq3DfynFA.jar
    Oct 11, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-a1oMgweA2U0_M8PA2odt3Ja2aVnhhB5SOactJ6hqbmU.jar
    Oct 11, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-CBL5IoZqK0Ilb8hltw4dHQTxN3lE98Mfw8zpytEgVZY.jar
    Oct 11, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-snJMez2sNas9N78BQr0qDPwzvi9Q6jOIC8sjJ0pUeW8.jar
    Oct 11, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-gvJnSE7XBV-Fuv7MM7xUWpGXkIxV60Pg4gZ1y8ullJQ.jar
    Oct 11, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-Kv1NXjKZcYAI-pdfUc0l0EytEC6vf4dr6m6Hhv0cy9g.jar
    Oct 11, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-4KqfQqz0g-E8OAFR3nKeZnmeecg4n6IWOeeR7qXBqPc.jar
    Oct 11, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-ldFNNAqtMbsbemYy1YyJW6ryVrmq4beMW020UERRK2s.jar
    Oct 11, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-cxQChyulZ_S9C4uqQKQXbblxMWLRbCWoARCBo8lCSDo.jar
    Oct 11, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-y1mEpzVuAeUKO4weSXWalzYwoUTfn4xVR5o7anleOSc.jar
    Oct 11, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-RZBQWbceriN9-OZYS2Aky88kRk0dxFkcv8hmsyclL8A.jar
    Oct 11, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-r5yPn9_TCoeWU6OEWfkRTwoLW4E16H7EIr3chtx7lu0.jar
    Oct 11, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-nvFCd7YZhXPZLI8VdDJTijejUqbjEDmD1IcQqUZfywM.jar
    Oct 11, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-LVL9erqraOaWDcnf_JKUj8n8pvUazivB0CRbdpp1Zu0.jar
    Oct 11, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-B_ZYSvLtCoKvJLboxz30t42vCjUs-UYLEaX6m2YJMXY.jar
    Oct 11, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-BJ2QyCAK8B9KyL-Qaa8nQzTHZzzZwL-jeRAm23rI__4.jar
    Oct 11, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-n9P2_ON_uB7Fw98AKFYDzu6Ya42hYTX1xtohfgB8yLk.jar
    Oct 11, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 0 seconds
    Oct 11, 2020 6:45:14 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 11, 2020 6:45:14 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 11, 2020 6:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 11, 2020 6:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 11, 2020 6:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 11, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96275 bytes, hash e99b9714df94cc58bbdb7357f0414c5b73a55b18e9b07363c468685bc81998e9> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-6ZuXFN-UzFi723NX8EFMW3OlWxjpsHNjxGhoW8gZmOk.pb
    Oct 11, 2020 6:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 11, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-11_11_45_15-547679426107809762?project=apache-beam-testing
    Oct 11, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-11_11_45_15-547679426107809762
    Oct 11, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-11_11_45_15-547679426107809762
    Oct 11, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-11T18:45:15.318Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 11, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T18:45:21.891Z: Worker configuration: n1-standard-1 in us-central1-b.
    Oct 11, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T18:45:22.501Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 11, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T18:45:22.542Z: Expanding GroupByKey operations into optimizable parts.
    Oct 11, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T18:45:22.593Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 11, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T18:45:22.671Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 11, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T18:45:22.707Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 11, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T18:45:22.739Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 11, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T18:45:22.784Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 11, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T18:45:23.241Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 11, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T18:45:23.320Z: Starting 5 workers in us-central1-b...
    Oct 11, 2020 6:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T18:45:48.930Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 11, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T18:45:53.274Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 11, 2020 6:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T18:46:06.893Z: Workers have started successfully.
    Oct 11, 2020 6:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T18:46:06.923Z: Workers have started successfully.
    Oct 11, 2020 6:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T18:46:44.341Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 11, 2020 6:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T18:46:44.518Z: Cleaning up.
    Oct 11, 2020 6:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T18:46:44.592Z: Stopping worker pool...
    Oct 11, 2020 6:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T18:47:40.958Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 11, 2020 6:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T18:47:41.010Z: Worker pool stopped.
    Oct 11, 2020 6:47:46 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-11_11_45_15-547679426107809762 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): dd472d14-ba75-4354-bc43-96945b4e167f and timestamp: 2020-10-11T18:47:46.225000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    16.939

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 11, 2020 6:47:46 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 44.635 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 30s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/3fbisu2chhxiw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1102

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1102/display/redirect>

Changes:


------------------------------------------
[...truncated 272.95 KB...]
    INFO: BigQuery method is set to: DEFAULT
    Oct 11, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 11, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 11, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 11, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 11, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 11, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
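
A minimal sketch of the remedy the error message itself points at: attaching a schema to the Row PCollection so a RowCoder can be inferred. The schema fields and the Create-based source below are illustrative assumptions, not the test's actual wiring.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Hypothetical schema matching the columns projected by the query above.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        Row row = Row.withSchema(schema).addValues("someone", "story", "An example", 3L).build();

        // setRowSchema attaches the schema to the output PCollection, letting the SDK
        // infer a RowCoder instead of failing with "Unable to return a default Coder".
        // withRowSchema on Create does the same for this particular source; both are
        // shown only to illustrate the two calls.
        PCollection<Row> rows =
            pipeline.apply(Create.of(row).withRowSchema(schema)).setRowSchema(schema);

        pipeline.run().waitUntilFinish();
      }
    }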

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 11, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 11, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 11, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 11, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 11, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 11, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 11, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 11, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
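
For context, the pushed-down read corresponds at the IO level to a BigQuery Storage API read that requests only the used fields and evaluates the supported filter at the source. A sketch under that assumption follows; the table reference is a hypothetical stand-in, not this job's actual table.

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // DIRECT_READ uses the BigQuery Storage API, which is what allows the
        // projection (withSelectedFields) and the filter (withRowRestriction)
        // to be applied at the source rather than inside the pipeline.
        pipeline.apply(
            "Read with push-down",
            BigQueryIO.readTableRows()
                .from("my-project:my_dataset.hacker_news")  // hypothetical table reference
                .withMethod(TypedRead.Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        pipeline.run().waitUntilFinish();
      }
    }
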
    Oct 11, 2020 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-bTAeV0ojImU9zVS3QktayPR_m6ovVZNljQyr7mmW-x0.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-JWfN2OVQHPHRb2JFKeWwdE0fXL3id37szmpF-s1Or70.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-7qxAsWgHdYLFYB9nwvOzGTIud60g4KCzOPig0in3Ak4.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-6Q46voaA5gH7zHSejTOf_Hwcw6BYAu2Wtr38fG_Vjc8.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-QIJhiUPAsgo8D0Pl9FZ6-rwmwpdGC-mNqBKlKCYy4vk.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-MoDuRjbUCiDVE_tW3oNpiT4JVGwXipbVWG2MEoUpFMw.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-J7ZurpJ6c1Dm-a6Z6CTF2hi3o_PhquiGyyDejc5HZFA.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-sFNaz9OTzos019UV20vlsqbLfp4QFPHkP99Tzi012x4.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-l1eeXmC0Wqf86YUw441YONl-nnG5jg-FzTrM1wumOq0.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-wnrzrNyEutnKPmUAISwtMwkr8kpJd4QWIxeKOYKTCH0.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-gyt9LcKdv8E7jXjaLIGKwxeUrOAy13pBcP8PJT6_7Dw.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-ARzkHbXD-JEVgyphIPNuU3iYFROD8B6UN7ZMRZq5pzk.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-ddEfXcgk1Y-M_R_RgnLY-TDlSW9IBhTJhhXyiOjQUE0.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4409654784848812877.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-MB9hRV8UqH41IZt4PnUeFSn8DAao5bTZLragXprxNaM.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-5RxTCKIyzw2y8LvBlb1oZqxzsoOPyx-FINQlhQ-mxzo.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests--RTO_0LFNaV8nWJfzm3iJ7sM1lIQlWWEwiqoDFh29bs.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-yFwCRC_UHxgpTbHQ0JDUvJsArw2YcjLzyKwSGjZguMg.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-GwPgWpkdPJ0O4kn_Cs_Mh0szEip4WbCrHzgPHxA71oM.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-ZlRn73PNK_8aTqEorgejF4kjPvNzecbu8gYZoy00Ow4.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-bTAeV0ojImU9zVS3QktayPR_m6ovVZNljQyr7mmW-x0.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-4TeOzjDeR0vcNUWKwV3dNh18Xs7YxeCKhaNbMNq7nF8.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-TBQy2xdns0UubJjwwjkOZxueEDhY8lTe5hL4nwCFZEg.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-BYQMDIWzhswNsd_P6U68YL6N7FqQSuqJxA_k1tFZupc.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-Z8zlqzshR_WB0YNYA8SA5QPvXsb9_VBFflL6IuteeGs.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests--TChbLsnjovaTQF25kUppzN5FOy9W0s02ldIMmSnk98.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-CklhowqvIoo_0HTmKiNKF2Va0rZUadZrugRMo_J6foM.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-9PeTACXALSlI33gW_OHPV4V3ayVcOf2u018p7S_d4sc.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-FeJMBc2vGwM6Nu5xzQGcfllPrh8EWmE_Z1Fv_jFT3tU.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-AJ6GhWqox3c8frps8VB_y_2wPtEpPOWdh6HR8hQtWH8.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-olZZoKdo0IOAMXKTlfmPHrCd04FEfe8W_s1IKxM0ewA.jar
    Oct 11, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-TuUCS_i7DAfI5euXJ0DtDIa0c6fu0fDEqloHk6fGGvg.jar
    Oct 11, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 1 seconds
    Oct 11, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 11, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 11, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 11, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 11, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 11, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96275 bytes, hash c317c729bb7711b6522e2335c000258794a01c8d2d42ad41c91c7d50fc562c03> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-wxfHKbt3EbZSLiM1wAAlh5SgHI0tQq1ByRx9UPxWLAM.pb
    Oct 11, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 11, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-11_05_45_16-805215841596731730?project=apache-beam-testing
    Oct 11, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-11_05_45_16-805215841596731730
    Oct 11, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-11_05_45_16-805215841596731730
    Oct 11, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-11T12:45:16.810Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 11, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T12:45:24.918Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 11, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T12:45:25.616Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 11, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T12:45:25.656Z: Expanding GroupByKey operations into optimizable parts.
    Oct 11, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T12:45:25.691Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 11, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T12:45:25.844Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 11, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T12:45:25.879Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 11, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T12:45:25.914Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 11, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T12:45:25.946Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 11, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T12:45:26.515Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 11, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T12:45:26.612Z: Starting 5 workers in us-central1-c...
    Oct 11, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T12:45:37.071Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 11, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T12:45:52.006Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Oct 11, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T12:45:52.069Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Oct 11, 2020 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T12:45:57.378Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Oct 11, 2020 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T12:45:57.416Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Oct 11, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T12:46:02.724Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 11, 2020 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T12:46:17.277Z: Workers have started successfully.
    Oct 11, 2020 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T12:46:17.313Z: Workers have started successfully.
    Oct 11, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T12:46:49.104Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 11, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T12:46:49.283Z: Cleaning up.
    Oct 11, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T12:46:49.387Z: Stopping worker pool...
    Oct 11, 2020 12:47:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T12:47:45.158Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 11, 2020 12:47:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T12:47:45.208Z: Worker pool stopped.
    Oct 11, 2020 12:47:51 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-11_05_45_16-805215841596731730 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): fd4f282f-787e-4f66-8b71-f0986a40b24b and timestamp: 2020-10-11T12:47:51.666000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    11.218

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 11, 2020 12:47:52 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 48.223 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 36s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/a7xz27vwskfk4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1101

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1101/display/redirect>

Changes:


------------------------------------------
[...truncated 272.13 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 11, 2020 6:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 11, 2020 6:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 11, 2020 6:45:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 11, 2020 6:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 11, 2020 6:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 11, 2020 6:45:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 11, 2020 6:45:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 11, 2020 6:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 11, 2020 6:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 11, 2020 6:45:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 11, 2020 6:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 11, 2020 6:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 11, 2020 6:45:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 11, 2020 6:45:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 11, 2020 6:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 11, 2020 6:45:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-tGrI1taanTB0R2E-OCtc4jbAs4cYRu8HAQbl6KR50Dk.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-9LP3zW3e-HnmkPZaR29SozePWIxvHV9RUlGtNDqBm_0.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-8urdjU8lBl9J77x63Kh2Mfgh6vEH5Ir3sFJsGeDcy-A.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-fn_xh-GJNwwHjlmAHrukZlZD2-JZXkUWjPSRAYxnPwI.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-idSNyNUHpZCDy83PfXhThmnm40g0PM3ReOfPYk1ncwU.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-21hZZihnnUxS0g3cXRd5Hl4zWeNMInGVY7kRWjCByUY.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-_fP3IjgmEkAtpxmAXl160ebHa7EtrxlSdO7yO_GC8-Y.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-fPZAs60Gosxx_z9d77yqzytMUA3fg2tGp3_jIo3aa9Y.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-bKDKdyBSdLsFm7gEDMvUx43iCXVVPyOpUpl5xeOKGWY.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-gVrDb_o-wzu8ElbSo7lx7svt5L66SezdMjYbGE2c_zw.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-m8WxBRA4ozTKM65lbQm_8IxkrWTA2hG1xotS6Aonx9o.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-tGrI1taanTB0R2E-OCtc4jbAs4cYRu8HAQbl6KR50Dk.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-ykHGU7erPpOwBUxwX90dyA2MgRaORDO02429FLKmPx4.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-bF_SIGjQ1Qt84QfhpslP3tHP9pygw7reDicNOo05tEs.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-QDeqtrnKY3WLb-U0hMu_rLZvON1kkaI9tkxOngLytbo.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5424754065583501912.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-VOsy7Dbn3bXJaXsRE8-od39rnRwvEgdAT6zu3E6LDKo.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-coPCR5f07UUCs3ZK5NlkkxWuMe_grQ4WKn6P6SOytWM.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-rAs4PKo-VBxmivfH1QmgmOAhgz57LgEupun4wghqbcc.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-XheH4S54P3uCfANxKu763RaYXwpBZkBP45HPqKOyIDg.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-Kctt3SExyPoPu7fQkG_K_NaVbFcnVzSjrPUcHvX33oU.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-ms6jQdm3qMFuhJi3mzYS_p88yA75yZR5jPjfWuTohAI.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-CVjcIEh91c_cwrGdDdA8cq-mUPSJjT8bUZ2SNyOVxP4.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-Y-xKfm7rsztgxAnCwbWTsLx3bpWXfxirZNKuTcsYySw.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-3bxIWb0T-0xlAR9o1u8iBF0JaNeFHjbkBhem3bCh7dk.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-IwhTzWiVJnx5DPO3I_Xt4dQTx3UwyauwXtD4nBc93Bw.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-nGSmvQ-rRDNRPx4LKetUgmEunronVBfrJlYhNDyUX8w.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-W3bQO4IDqigWDcO9qLq51t-scJ8iobTZME3IXk09OwM.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-sMZm-_lnq4_0F7qmh0cTlUknr8DeEBm6AfBS_rhS0u4.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-o2YEVxnaDj2HhZn20-BGDulyvU2a2mIWYZD6cfk_3sY.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-tZOMNoXs2s1KR9-EvYbW4bWJpVSNvZgZOBjaCsgobTI.jar
    Oct 11, 2020 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-RB06NNBnw37u7OxQ39UNW9v0b8T9qhwJi3R6Qk_Vf_8.jar
    Oct 11, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 0 seconds
    Oct 11, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 11, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 11, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 11, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 11, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 11, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96275 bytes, hash 9235a28ca6009d1935c78fe31f90e12fcd1433919929fba621a5ec38c81dbd4a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-kjWijKYAnRk1x4_jH5DhL80UM5GZKfumIaXsOMgdvUo.pb
    Oct 11, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 11, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-10_23_45_14-10203789929115180699?project=apache-beam-testing
    Oct 11, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-10_23_45_14-10203789929115180699
    Oct 11, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-10_23_45_14-10203789929115180699
    Oct 11, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-11T06:45:14.545Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 11, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T06:45:23.369Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 11, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T06:45:24.003Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 11, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T06:45:24.044Z: Expanding GroupByKey operations into optimizable parts.
    Oct 11, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T06:45:24.071Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 11, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T06:45:24.138Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 11, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T06:45:24.223Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 11, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T06:45:24.251Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 11, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T06:45:24.287Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 11, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T06:45:24.755Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 11, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T06:45:24.839Z: Starting 5 workers in us-central1-c...
    Oct 11, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T06:45:37.313Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 11, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T06:45:48.192Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 11, 2020 6:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T06:46:15.059Z: Workers have started successfully.
    Oct 11, 2020 6:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T06:46:15.090Z: Workers have started successfully.
    Oct 11, 2020 6:46:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T06:46:50.072Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 11, 2020 6:46:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T06:46:50.200Z: Cleaning up.
    Oct 11, 2020 6:46:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T06:46:50.269Z: Stopping worker pool...
    Oct 11, 2020 6:47:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T06:47:40.202Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 11, 2020 6:47:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T06:47:40.241Z: Worker pool stopped.
    Oct 11, 2020 6:47:45 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-10_23_45_14-10203789929115180699 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 15f5a25f-8842-43ad-8074-3e22661c1be7 and timestamp: 2020-10-11T06:47:45.911000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.805

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 11, 2020 6:47:46 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 44.974 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings
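
To surface the individual deprecation warnings mentioned above, the failing task can be re-run with the flag Gradle suggests. A minimal invocation, assuming the standard Gradle wrapper in the Beam checkout (the task path is the one reported in this log):

    ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --warning-mode all --stacktrace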

BUILD FAILED in 3m 29s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/ikqilucee2ef6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1100

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1100/display/redirect?page=changes>

Changes:

[noreply] [BEAM-11033] Updates Dataflow Metrics processor for portable job


------------------------------------------
[...truncated 271.83 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 11, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 11, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 11, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 11, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 11, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 11, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 11, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
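
Both failing tests stop at the same point: the PCollection<Row> produced by ParDo(RowMonitor) has no Coder, and the exception text points at PCollection.setRowSchema (or setCoder) as the remedy. A minimal sketch of that mechanism in the Beam Java SDK, with illustrative field names taken from the projected columns of the test query (this is not the actual fix applied in BigQueryIOPushDownIT):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Sketch only: attach a schema to a PCollection<Row> so Beam can derive its
    // coder. Field names mirror the query's projection (author, type, title,
    // score); the real schema would come from the HACKER_NEWS table definition.
    class RowSchemaSketch {
      static PCollection<Row> withSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();
        // setRowSchema(schema) is roughly setCoder(RowCoder.of(schema)).
        return rows.setRowSchema(schema);
      }
    }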

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 11, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 11, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 11, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 11, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 11, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 11, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 11, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 11, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 11, 2020 12:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 11, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-sKtUnzgcMrDQUgKbdmeDCkFlUKh_eEyb-MtafRACxS4.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-ahan8RzE_pBqUoYdAwweyiKfV6pYWBfIs8GENjiRiec.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-I-C6v8ofUZp5d9t6v26UHeEK2x2gh1HedIZg4Q9EP9Y.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-KVHXkrWu6PXm1leLZhdin1SDbkc8TByk7GbDDU_e438.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-WeEhQAw1lcY42X3YQYZuZudAN5Lhm-g3MC8ZuzmQw7g.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8823527844561654722.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-XbiEzvhkJ20EP29ZQhC5gTLjXjHXJo8SGmyJpEJAN0o.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-LqBq3Qvkxz7m3w5zDVnwFxKtdD6r-0R1YM7xBxGthCI.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-4xw7mpXfO8XBAKKrctjX1zzIFi5YHMGibGYTKsg25Ro.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-CHslTQIIW5R6lDR8Vdp8onefJonhPqlPzeWd1aT3GBs.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-hwktOTvgkH-mLwdqNGDUzukeZI-GQbow290VGCYb_GI.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-sKtUnzgcMrDQUgKbdmeDCkFlUKh_eEyb-MtafRACxS4.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-oLttMwJxNy9ihd5zYFTrFIGRA2vz2nHAyNIuZL0VVIk.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-phT-vVtoWJzRq2teJxnvk9vXx6xPL2THtwAr4oySMxo.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-DCXpCpInpp5nWamT0oFiYNDpiXlDvmJlDI5vCXeoN8E.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-LwqwxEGs2YrOPqEodOww_841AtmzcoMn5vduaVrsWjA.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-qDGth1ksd5a7DKdCvGT080ktEI_T1YZcnizHG5XaCSg.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-XwKu8-hzUpe3nP-mz7qhu6urjh6J2LrWB67CGD-wytw.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-WqyT6tX-wXrn4apCFAPfDy4d7IU4bfARsRnWSdaLcG8.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-wiC9OdRCVQy7Cn1VLme-Q9x7BrUmXq1uDjAbVEzqW4U.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-MO4dzm0O2zXLY-spc6BUa8FgkbipWnD8Y5YKmXdcqS8.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-Q9zWdBrtKuLomwETWPSBtq-B1QNspJX5UjO5KFAY0bk.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT--r7kMGSia8YfsugCRBfgYALLuuzRC3zQuoDsZ4DcekY.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-KQe-DYYendRAN4XS73jsM7EawfnnRczA6CBX42sJ4Nc.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-GiLt4CLOXnfb3OaCXcz-WaDXcFBtjRSt7aLs5tJhkSw.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-h4drwkGVLFIQzeewESSBHcXmHebTlz-qnpxAyeqSQ2c.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-fyBsZNiJoZPjuJNvThnabop06DI3oM9Ctu7PEDpQMk4.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-KltOoXuqVPW6AJStuR8T-eNdPLj8wV-h0oPTlpY9naU.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-4flR9IGC6q-YN_PBnSAgXX-JRSSFti3ygM7ut_h6ytU.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-9QDqwxAx6tFvFQzCo35tr8EPHPMCvPcr6xxhleYIKw0.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-itwsT56qqcbq3rSW5l2JugDNekaO8tsVgebQ761txqo.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-d7NOO8nhOtpupSIz2fEQp8-vonQuHk4ONyIKFDGd79E.jar
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 0 seconds
    Oct 11, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96275 bytes, hash 1ffb8fe6d0687fec9524e0dd03e5657d7fdaa2f1951d74444aa545ee835b554f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-H_uP5tBof-yVJODdA-VlfX_aovGVHXRESqVF7oNbVU8.pb
    Oct 11, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 11, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-10_17_45_18-18041024337860533441?project=apache-beam-testing
    Oct 11, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-10_17_45_18-18041024337860533441
    Oct 11, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-10_17_45_18-18041024337860533441
    Oct 11, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-11T00:45:18.429Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 11, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T00:45:26.772Z: Worker configuration: n1-standard-1 in us-central1-a.
    Oct 11, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T00:45:27.527Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 11, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T00:45:27.569Z: Expanding GroupByKey operations into optimizable parts.
    Oct 11, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T00:45:27.598Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 11, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T00:45:27.662Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 11, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T00:45:27.688Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 11, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T00:45:27.722Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 11, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T00:45:27.757Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 11, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T00:45:28.168Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 11, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T00:45:28.249Z: Starting 5 workers in us-central1-a...
    Oct 11, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T00:45:40.737Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 11, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T00:45:54.551Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 11, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T00:46:13.294Z: Workers have started successfully.
    Oct 11, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T00:46:13.326Z: Workers have started successfully.
    Oct 11, 2020 12:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T00:46:43.921Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 11, 2020 12:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T00:46:44.104Z: Cleaning up.
    Oct 11, 2020 12:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T00:46:44.185Z: Stopping worker pool...
    Oct 11, 2020 12:47:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T00:47:36.689Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 11, 2020 12:47:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-11T00:47:36.727Z: Worker pool stopped.
    Oct 11, 2020 12:47:42 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-10_17_45_18-18041024337860533441 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b8d23df2-e845-4f85-8980-c7b0dab8274d and timestamp: 2020-10-11T00:47:42.902000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    11.578

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 11, 2020 12:47:43 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 39.437 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 27s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/zjl6yphmzkfp2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1099

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1099/display/redirect>

Changes:


------------------------------------------
[...truncated 272.56 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 10, 2020 6:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 10, 2020 6:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 10, 2020 6:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 10, 2020 6:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 10, 2020 6:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 10, 2020 6:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 10, 2020 6:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 10, 2020 6:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 10, 2020 6:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 10, 2020 6:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 10, 2020 6:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 10, 2020 6:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 10, 2020 6:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 10, 2020 6:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 10, 2020 6:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 10, 2020 6:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 10, 2020 6:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-ntjziMtaYcPv3cvR-I8QOT-o_BI0hSnBMue_tsk0J0g.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-iScbmDIANsY_he77lBebFmTuvzADVa1AjQPedVgc78Y.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-ESZ3kEydMwRVZs5h3RqtULBbU8MHncu5ZjkeJpAGxTc.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-Jf6G7UG3LxjkJSoIDFSLyqMPClhPQzpfexXqg-Fn15M.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-ED3chUMCSnONSS70g8BgLGgMb7ciLVCzW36RBc7f4do.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-H33QfDd8FzNl-2PrbqxX_Y8kdRhnbzkgMU4rxmiSrxU.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-pIztxtfXtp_T2U13SeJnA7fwnTKxS5zXPvIXPmCr96M.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-2K40wnyEuuqoEGcAVsYNRj2a9hFeo7wisN6ylsJk9uM.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-ADAPqD9UYfLMWEwxxw7mkYAAbkCAZHtWcIr0445DLxE.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-DebPktJXTN2Llb8ik-pzX8vMxf9Mqopchzm67l4vjKI.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-ESZ3kEydMwRVZs5h3RqtULBbU8MHncu5ZjkeJpAGxTc.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-21yqhevgBduvslKFTAFeXYd49qMceEjdIWxFpV3rWF8.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-32NK7C-LdujkEKxoGFz_SvKpSb6yEf9OUD7OWun8PY0.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-ztlg2lRj8kxN45Z_KxvK-6AY8PaQek9IhW8mhkAcJ2Q.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-4GLwh2NA1OjIw5zhSAyFigHlhn_zts0_g9nfPAXtciE.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-oYEBJwTTjvcnCL55lbp-qHFx8OPkX933jdBahsHGFIk.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-QjZE6lcr-Pgp44pO2IP75jlyxYoxDjj8xEQ12UqdPIY.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-j-2vcenDoi1g9tWpCMzDUSnvsm_F0ndxsXtDEEi1x8Q.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-NacMamf3l_IHY5zTcBlprGi99tYZTxziAsMkbnNRw2M.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-IjkEUMGFo7P6SNYpBZVxteqlYZjlT5LE2zTKZzy07T4.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-0R_m4HQAkMong2xsBWlMVq-hn9fmDt6pw2mKw8Tp3yM.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-CfKP4aYbpk5dFpTRel0uQTH5t8OgvUY_vJTbmoO7DdQ.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-v601P4dKIMs0ULwn6-P1QiQ6iBr49EBRbXKAEQgzGH4.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-_iG7YcxUdGER8-f-T-OsFTO7X371q-fDAT3wmS7UT7U.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1300378360137826528.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Ye-Y28-AHWiTC8ZpItSoJk-5TCsekMeNWxlZOUxJvoo.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-zVloAP9u4uN_xoIMyg1i_4mr7NfoEyqHBqZn3w3NkuA.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-TqK-WGt7q4A7CmqL-anlcZHX08v0liwFc42bTbB2o94.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-GGpwcPHXNhpVltzQBb3_g6Du9K9OLDik6hFBpzWKy9I.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-D-A4OB4PyTjSlxC_eqF-uYL4p8C_xa_XTuXHzq7auzo.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-hEsC-DSvL1lulKgkpqgUDEneByACu4cJSHUoqbm6sfc.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-7pCvj5Han7_ZxudMzkDFqvTsagPJ_CwLLwTxb8HiI-4.jar
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 0 seconds
    Oct 10, 2020 6:45:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 10, 2020 6:45:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 10, 2020 6:45:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 10, 2020 6:45:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 10, 2020 6:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 10, 2020 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96274 bytes, hash 26c1ac70d7de923183ddd648eba278deb989a144b93b6616e6501a9dfd843e03> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-JsGscNfekjGD3dZI66J43rmJoUS5O2YW5lAanf2EPgM.pb
    Oct 10, 2020 6:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 10, 2020 6:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-10_11_45_12-3839466732861441718?project=apache-beam-testing
    Oct 10, 2020 6:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-10_11_45_12-3839466732861441718
    Oct 10, 2020 6:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-10_11_45_12-3839466732861441718
    Oct 10, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-10T18:45:12.789Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 10, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T18:45:19.757Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 10, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T18:45:20.400Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 10, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T18:45:20.441Z: Expanding GroupByKey operations into optimizable parts.
    Oct 10, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T18:45:20.470Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 10, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T18:45:20.537Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 10, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T18:45:20.564Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 10, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T18:45:20.591Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 10, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T18:45:20.626Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 10, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T18:45:21.030Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 10, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T18:45:21.097Z: Starting 5 workers in us-central1-c...
    Oct 10, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T18:45:41.957Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 10, 2020 6:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T18:45:47.134Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 10, 2020 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T18:46:10.600Z: Workers have started successfully.
    Oct 10, 2020 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T18:46:10.638Z: Workers have started successfully.
    Oct 10, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T18:46:46.140Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 10, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T18:46:46.273Z: Cleaning up.
    Oct 10, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T18:46:46.349Z: Stopping worker pool...
    Oct 10, 2020 6:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T18:47:28.638Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 10, 2020 6:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T18:47:28.688Z: Worker pool stopped.
    Oct 10, 2020 6:47:34 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-10_11_45_12-3839466732861441718 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 68bc8771-6cd9-4ebc-8c10-f189a23b8882 and timestamp: 2020-10-10T18:47:34.374000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.434

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 10, 2020 6:47:34 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 35.358 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 20s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/ntnonrpsexgtc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1098

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1098/display/redirect>

Changes:


------------------------------------------
[...truncated 272.83 KB...]
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 10, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
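
The IllegalStateException above is Beam's standard complaint when a PCollection of Row reaches pipeline finalization without a schema or coder attached. Below is a minimal, hypothetical sketch of the remedy the message itself suggests (setRowSchema / setCoder); it is not the IT's actual code, and the class name, sample elements, and field layout are illustrative only, with fields mirroring the columns projected by the query above.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());

        // Hypothetical schema matching the projected columns (author, type, title, score).
        Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(Create.of("story,3", "job,5"))
                .apply(
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void processElement(@Element String line, OutputReceiver<Row> out) {
                            String[] parts = line.split(",");
                            out.output(
                                Row.withSchema(schema)
                                    .addValues("someone", parts[0], "a title", Long.valueOf(parts[1]))
                                    .build());
                          }
                        }))
                // Without this call (or an explicit setCoder), pipeline construction fails
                // with the IllegalStateException shown above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }
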

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 10, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 10, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 10, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 10, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 10, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 10, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 10, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 10, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
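
For reference, the pushed-down predicate has the same shape as an ordinary Beam SQL filter. The sketch below is hypothetical and is not the IT's code: the push-down itself happens inside BigQueryTable when the DIRECT_READ method is used, as logged above, while this example only runs the equivalent WHERE clause with SqlTransform over a small in-memory input (class name, schema, and sample rows are assumptions).

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class PushDownFilterSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());

        Schema schema =
            Schema.builder().addStringField("type").addInt64Field("score").build();

        PCollection<Row> input =
            p.apply(
                Create.of(
                        Row.withSchema(schema).addValues("story", 3L).build(),
                        Row.withSchema(schema).addValues("comment", 10L).build())
                    .withRowSchema(schema));

        // SqlTransform exposes its single input under the implicit table name PCOLLECTION;
        // the WHERE clause mirrors the filter the planner pushed into the BigQuery read above.
        PCollection<Row> filtered =
            input.apply(
                SqlTransform.query(
                    "SELECT `type`, `score` FROM PCOLLECTION "
                        + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));
        // 'filtered' would contain only the matching rows ("story", 3L in this toy input).

        p.run().waitUntilFinish();
      }
    }
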
    Oct 10, 2020 12:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 10, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 10, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-c-Ml0jIGpNqt8FhSgM6SvO-NHm-XfhzxbsngSve37nY.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-ON0i-2knZ1s12FPVjctPnMkBx3tegFjXbNByAEGX4gY.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-c-Ml0jIGpNqt8FhSgM6SvO-NHm-XfhzxbsngSve37nY.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-PMQNbOUIVznd6gAryItY8eRGcDfDOp5nYapKk6rCZeI.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-O5QPO3HH5Lhn6Dr3VWHN475CB6a2Vev7DwctZUUa6vY.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-dOKzfIuyz49uDad3urCsXvoySLeEyLVGhLcIh0SpBUQ.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-LavNg4qKHtOEvdqtA90ytQKIv4j8HMFlVtFFgxHsvFY.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-qQrtfmmXQSYT5cNJK2JHKONEk9DpeKXCB0EHH1lu7aY.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-E9LP8-wlOeOiBEgBxvbq0-Q7cDCH-r1opkDSwCCoqLg.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-G_KYE-ZKxQ5RLRLIMc3gCT8EPfLMEXjY4ty78Q_AS7c.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-YzJGLTQkvu2kPgNAosPN_lv2e294Gqx_1H29FRf0v3c.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-PhRG8_QwuwRzNwX_Z2XbmD9HIQ0OoyB1d1A76yeFJtI.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-dGezWepO_SKK-0XhlghdRno-V0gp_xFhDfziMnt_EFo.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-z2BOKpK2RI8qkl-O9d99-73wfe9fqe3fSjt5DfdOk_s.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-6ULUERkzuhxoSDFuYYJgNgNgyhaPvCFqkfGv6P-gJvU.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-QxHQ_AOhJ38FL5wLk9MF1ja_8S0NgIuc7zi-lNEM0YA.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-5pUAY7evxGvP79fx1zEeyv9AVY-wCRqP5P80G78gWqI.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-iTKOIz6w_poAgUJMD-lXZ5qeLoHigt8oyuUMsEIb9Ic.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-JeqX8ipPi_XHoPfEeMhc_I8pzZm-veLzsCHruNwSNJY.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3047129993201080723.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-icZrbEdN0CYc9bSz2S9I4MhwkjmVplLSWxOejSseUbg.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-2D43x6z5NK3veIIlGLUJtipfe5FHUuNH3om5RYESrRI.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-eKTUrjfFBz0wUHPrjTLba_bUvlaToxYqq38M8rle2wQ.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-hAjQFOhGJon8IioNHDjiAjIijoiZq4J0br6SYeQcnGA.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-NqYAUUoVxc3CD1jH539VSqERAVcV8HTbW6gms2pKo50.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-drYw9e-iR_tpm6J_B1vkzKrXCUpG6dPmZwTihTIJGIo.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-7L9O9-90s_bIF6MAYxxIrgzEZ42NBrKJT5ADn0nTXcQ.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-muMy_0vNmbWgdESFkk7d-JdDluqYw6aboqNLoxpcbLI.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-F9mni6nurOVeurQUyJWzYWKk4Ja8pP53yObZAkHsA44.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.testcontainers/kafka/1.14.3/b90885e30e86eb454e7b0e8e580cf59616e9de39/kafka-1.14.3.jar to gs://temp-storage-for-perf-tests/loadtests/staging/kafka-1.14.3-ITKoa6D-lUtLem0itUO_R5B5PqtFpvvOjJf9bTByVUk.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.testcontainers/testcontainers/1.14.3/71fc82ba663f469447a19434e7db90f3a872753/testcontainers-1.14.3.jar to gs://temp-storage-for-perf-tests/loadtests/staging/testcontainers-1.14.3-pumfUOVGxIRJ3HrNRtmRWTZcp5hY59_klYGQF6d6puQ.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-vDNhjXxjbkwVrv1du-IMZdkeAZiuPSXEeVspuGAC8Qw.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-03Sg8GarTl3XTw3Otr0GYzMKKpXz35voZ8hFrkVRM_Q.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth/tcp-unix-socket-proxy/1.0.2/cf53989130986c60113032e25185f4496ffbc186/tcp-unix-socket-proxy-1.0.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/tcp-unix-socket-proxy-1.0.2-KCMDGSy00Z-cM3qmB0jDR5sQSJV17Q8Xofxkoxij7SM.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-eAd2055i51FCOf6vlQpjuVSVOyKZKABu2QVnLT8VuaA.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.jetbrains/annotations/19.0.0/efbff6752f67a7c9de3e4251c086a88e23591dfd/annotations-19.0.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/annotations-19.0.0-Ev8B7q8MCcamjy7AJLO_n6TK1uaLdLlov2LH91kEcDI.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna/5.5.0/e0845217c4907822403912ad6828d8e0b256208/jna-5.5.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-5.5.0-swj66_5O1AnehBDgpjLRZLISawNfbqz_lo05CMr7TZ4.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth.duct-tape/duct-tape/1.0.8/92edc22a9ab2f3e17c9bf700aaee377d50e8b530/duct-tape-1.0.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/duct-tape-1.0.8-Mc7xLd7JedH4bXz3CMQaF9pSPQXGhf1mQunQsq3bckA.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.kohlschutter.junixsocket/junixsocket-common/2.0.4/b4d1870bf903412533e0b79c6fcd402defcfc05b/junixsocket-common-2.0.4.jar to gs://temp-storage-for-perf-tests/loadtests/staging/junixsocket-common-2.0.4-r8N2Fez3-t_3TSmvtEP-T2M9OWZG2J2CXoIkoneDn2A.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth.visible-assertions/visible-assertions/2.1.2/20d31a578030ec8e941888537267d3123c2ad1c1/visible-assertions-2.1.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/visible-assertions-2.1.2-RQSulosjfNzcto_1sHqmOr5JkvkHp3w9YSCqm5BBQBw.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.kohlschutter.junixsocket/junixsocket-native-common/2.0.4/726bd66a934dea39c817382986496fa4eda96411/junixsocket-native-common-2.0.4.jar to gs://temp-storage-for-perf-tests/loadtests/staging/junixsocket-native-common-2.0.4-92O4XsFT2VMJB0dOfyBspSsocDfXBLrO3aON1cTQ9gw.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna-platform/5.5.0/af38e7c4d0fc73c23ecd785443705bfdee5b90bf/jna-platform-5.5.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-platform-5.5.0-JNgWIfgqwp_N2adBFgMfWQeiNDFY5hb0Vzu_okNK4NU.jar
    Oct 10, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.scijava/native-lib-loader/2.0.2/1451fa03954c5e31a358b411147de472b4dab92c/native-lib-loader-2.0.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/native-lib-loader-2.0.2-5WfHHp8_9T94vVj9a6bUcc4x4SY_XofR4fzF0-2h4kg.jar
    Oct 10, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 179 files cached, 41 files newly uploaded in 1 seconds
    Oct 10, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 10, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 10, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 10, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 10, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 10, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96275 bytes, hash 4d1d0f30c0c23f1813cfa9348969fa3003dfbbd01d0b0428a19876562c651112> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-TR0PMMDCPxgTz6k0iWn6MAPfu9AdCwQooZh2VixlERI.pb
    Oct 10, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 10, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-10_05_45_15-15879941638525487593?project=apache-beam-testing
    Oct 10, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-10_05_45_15-15879941638525487593
    Oct 10, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-10_05_45_15-15879941638525487593
    Oct 10, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-10T12:45:15.973Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 10, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T12:45:24.392Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 10, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T12:45:25.077Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 10, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T12:45:25.113Z: Expanding GroupByKey operations into optimizable parts.
    Oct 10, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T12:45:25.143Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 10, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T12:45:25.221Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 10, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T12:45:25.257Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 10, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T12:45:25.290Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 10, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T12:45:25.326Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 10, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T12:45:25.668Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 10, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T12:45:25.745Z: Starting 5 workers in us-central1-c...
    Oct 10, 2020 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T12:45:46.921Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 10, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T12:45:50.753Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 10, 2020 12:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T12:46:17.187Z: Workers have started successfully.
    Oct 10, 2020 12:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T12:46:17.220Z: Workers have started successfully.
    Oct 10, 2020 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T12:46:52.906Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 10, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T12:46:53.077Z: Cleaning up.
    Oct 10, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T12:46:53.162Z: Stopping worker pool...
    Oct 10, 2020 12:47:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T12:47:44.926Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 10, 2020 12:47:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T12:47:44.969Z: Worker pool stopped.
    Oct 10, 2020 12:47:50 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-10_05_45_15-15879941638525487593 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 4c7e52cc-4f8b-4481-956b-efbd0030f77a and timestamp: 2020-10-10T12:47:50.374000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.944

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 10, 2020 12:47:50 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 48.228 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 34s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/b4shjrncle7i2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1097

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1097/display/redirect>

Changes:


------------------------------------------
[...truncated 272.31 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 10, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 10, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 10, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 10, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 10, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 10, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 10, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 10, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 10, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 10, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 10, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 10, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 10, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 10, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 10, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 10, 2020 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-BeqDBfrHwRMBkCtBt7rb_DSpSIyv3zov79l6jkeIEPw.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-lKVpF6Xlp4E1kW-jdvK-tJVfvxIereXOd6pUinUVgvw.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-ubICCDQBiLYE36UJrFgnK_ygnTHObj8QZCtbR5faBBw.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-sWDTuVHfBu0uaP8gQvsRUIV-bWakAsy1hocQvbrvnQ0.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-aJAh7-maER4xcuH3q5qEm3VE2ZsbdPGZY9QiOGQb1yg.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-BfyKwCk-NOTxTUf9cwHWuhC-Td7y4IX1eJ74hDeZlMw.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-UNbJlLRogR8TlWDoAI4pXj1_gOB-aukV7LaDudPFO5g.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-TqMkgdq3dFRGeS30PepIEH_pszzNEYVK-GcVVYqEUYo.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-nPbLxBQ0bODDZx3-CtHBXeqyz6e__yfyAgqxt397T-w.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT--EXNxdbdhq57UVmV90jRsSuJ5hQAuZb-9RzcxhzZsnc.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-FWBLdw6voCmgjnJrpN-u87b9ceYHRdU2nd87Z5p2HRQ.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-UNbJlLRogR8TlWDoAI4pXj1_gOB-aukV7LaDudPFO5g.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6987131328392769710.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-FZiJiTeZn4Mh7M-yuJ2Bk-RUSQNnIShYKJgrL095V1U.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-V7MR8-TKquV3XDRoWXRFJZ8gO3v44PxvA4XGVjO2Ig8.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-cdUfylU8Oya_AWnwcbw0pNdrbvN5jPIw3hhaQ8Znq6Q.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-4FHR5yeaOpTAOA-EU40MT1EPvSi9khU6dGJ9u5B61yA.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-Nm4BRdvhe4IZ7GJQt3022IcFXo27kcn6B_JCUrGYB5E.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-nzDksm4whQs1bxJPm--qMiyk5qGwloUnEOTKWz80n0U.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-pXvJdMcTp0wJNiVxzf3J2_-L3tsmJYr3RKPTQn3vU0g.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-jGiUMKtPJqerOZlak1CEr9tqsk_1Y4tGCfrvonSh2_w.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-jvZ3Zz-8B5-pk6qY9qW8SrJpSST62F0akzrvVlritLY.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-BDN4bZSml_Axhh_Sc-8ZkDrfQOp4fEtKgfE6jNfsEHo.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-Cj4NLJityVoddihEW3m5ST5lWIa-FgSLqDOKwM6Fk8A.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-LRX2CEs4vluPmBVnO93PFrfKMT-oYdHhRmbERjlPjrk.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-ZDurHq3lw-PLQD5Gls86MvXtxV4eIO3KLJW5CFJAU4k.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-OFNwUTF6a6YjwRNa4CdD-zeBhZe197polCBBzi8SFHc.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-PzIAp58qf-JeoJxVJPFxGwebSmHHeCk-IOg9LW9KVbg.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-WtYZEIQyyAY1XX3dwrwsHb8e34VgUCeRb0yw7vgSp00.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-0jZE_V5C0j_XkBPLuL5FCNp-BlR8AQ7RjqWv8hYd80I.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-sN6cbkFeUTkbzlUCtiA-KadzEe7KbqcouhCLCX4v7YY.jar
    Oct 10, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-cn7PtNd0LRjU183a8_AVDbP0Gx48syTla1L1TYBXy7M.jar
    Oct 10, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 0 seconds
    Oct 10, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 10, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 10, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 10, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 10, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 10, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96275 bytes, hash c43c1094fcb36b1808565c8922321c3f3c824749db3e7434e1da334e78370a2a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-xDwQlPyzaxgIVlyJIjIcPzyCR0nbPnQ04dozTng3Cio.pb
    Oct 10, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 10, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-09_23_45_16-655017934679905775?project=apache-beam-testing
    Oct 10, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-09_23_45_16-655017934679905775
    Oct 10, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-09_23_45_16-655017934679905775
    Oct 10, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-10T06:45:16.851Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 10, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T06:45:24.502Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 10, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T06:45:25.196Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 10, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T06:45:25.238Z: Expanding GroupByKey operations into optimizable parts.
    Oct 10, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T06:45:25.270Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 10, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T06:45:25.346Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 10, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T06:45:25.376Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 10, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T06:45:25.402Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 10, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T06:45:25.432Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 10, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T06:45:25.837Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 10, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T06:45:25.905Z: Starting 5 workers in us-central1-c...
    Oct 10, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T06:45:38.828Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 10, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T06:45:49.379Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 10, 2020 6:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T06:46:07.657Z: Workers have started successfully.
    Oct 10, 2020 6:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T06:46:07.682Z: Workers have started successfully.
    Oct 10, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T06:46:43.650Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 10, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T06:46:43.783Z: Cleaning up.
    Oct 10, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T06:46:43.854Z: Stopping worker pool...
    Oct 10, 2020 6:47:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T06:47:28.121Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 10, 2020 6:47:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T06:47:28.153Z: Worker pool stopped.
    Oct 10, 2020 6:47:34 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-09_23_45_16-655017934679905775 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 539eb239-b551-4c06-896e-239ab5a52a3d and timestamp: 2020-10-10T06:47:34.589000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.741

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 10, 2020 6:47:34 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 2 mins 30.895 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 17s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/qskfqxitsae5q

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1096

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1096/display/redirect?page=changes>

Changes:

[noreply] Revert "[BEAM-9196] Update testcontainers to 1.15.0-rc2 (#13031)"


------------------------------------------
[...truncated 284.12 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 10, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 10, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 10, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 10, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 10, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 10, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 10, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
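
The exception text above already names the fix: the Row-typed output of ParDo(RowMonitor) needs either an explicit schema or an explicit coder before the pipeline graph is finalized. Below is a minimal, self-contained sketch of that pattern in Beam Java; the pipeline, field names, and pass-through DoFn standing in for RowMonitor are illustrative assumptions, not the actual integration-test code — only PCollection.setRowSchema (and the RowCoder alternative) come from the error message itself.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Schema for the four projected columns; the field types are assumptions for this sketch.
        Schema schema = Schema.builder()
            .addNullableField("author", Schema.FieldType.STRING)
            .addNullableField("type", Schema.FieldType.STRING)
            .addNullableField("title", Schema.FieldType.STRING)
            .addNullableField("score", Schema.FieldType.INT64)
            .build();

        // A pass-through ParDo emitting Rows, standing in for ParDo(RowMonitor).
        PCollection<Row> rows =
            p.apply(Create.of(
                    Row.withSchema(schema).addValues("someone", "story", "example", 3L).build())
                .withRowSchema(schema))
             .apply("RowMonitor", ParDo.of(new DoFn<Row, Row>() {
               @ProcessElement
               public void process(@Element Row row, OutputReceiver<Row> out) {
                 out.output(row);
               }
             }));

        // The ParDo output cannot infer a coder for Row; attaching the schema explicitly is
        // exactly what the IllegalStateException asks for. An equivalent lower-level option
        // is rows.setCoder(org.apache.beam.sdk.coders.RowCoder.of(schema)).
        rows.setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }

Whichever form is used, the schema has to be attached before anything downstream consumes the PCollection, which is why the stack trace above fails inside TransformHierarchy.finishSpecifyingInput at pipeline-construction time rather than during execution.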

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 10, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 10, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 10, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 10, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 10, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 10, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 10, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 10, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 10, 2020 12:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 10, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 10, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-71m53Ym_6eKsaJNnatrbSLoCmDuUELBw17hQUj-ETKc.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-UptCMB8fiPxVDqDRiE2AvCb679GUrzh0qiPlQKLTVGM.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4218191528973871932.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-TXSQwinpweZMqzfvUcFPBLDruww6kbSjIrtyrZs4djE.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-1jGhOwd4pWuKopWoKlojms_rKYHNn4KgLEIHDIXrn9w.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-Ny5b5dBz3dysv4cOJRzGdrtpNQqMSU9iXC1P68WGIR4.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-TkjtFf8MSl0elng12GmKLdpi0yyCX0iVrwVV-zH5TYg.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-26RkxxQAh2v-ij7DB9EJonTXCvfF5PxeSX0uf9zqq5c.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-71m53Ym_6eKsaJNnatrbSLoCmDuUELBw17hQUj-ETKc.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-BNjB7nreYhJuVdUz5FwtSWJ7qw9wJzMpMcc7Bkeisg8.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-vXXupr78wBM0hbLu6feHoKSSNuZ0spSW8BYM6y7WzUM.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-WSgpUr7XEkcoBWzZSF9Cd7fDh0gFcM8_wo6bQb8vjJc.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-qEal16X-7mRFTzaUZz6T7TgpMYED0LbcWdHZAnVyoEU.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-hzZqTxYVLtgqTh9qjZN-DzNv1Y5PJiXbnYdBZR7VV3g.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-Mo-TlgSfBGEGymwqRTnuOr5WGZQQbDq9bktoq6xybAI.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-UQ1YvuvWtxkJoojnWql2B6mg1w1T8h0J9UhSAbHGoRc.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-vqWQ32QlCl6vJ4CfWBRZK_XlHjrGd_MWRluNYEKWF7E.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-luz_u3dRqm6MakB_O8Cjkrpb54jA6V4ntbWo8s6UE3c.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-08Z2UdxIHWsu7ejj3rRcphmjQS5EQq6TXimpHX18yU4.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-TkYiBeHMc_kMsu2lkrmBkyM9PLHQiKzTU8y9U84XXEA.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-4GNPYaFRd9ov3lk9kPpQlDtJgYciAvwEWRUjg-1gqk8.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-7zy4Nsas_-7dNFSGXDoOVo7DfJgCCOeigQVF_2RajFg.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-F9gn13mYqZyZMwNupyDU7VLELPrT53wqsz8Wnwx5WLg.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT--hQEH842cBAa6zzroKqUPXkhugOiMjITVMBqnqhh8bc.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-ShvXpPv2Q8FBg0qrIfO-r1Uy3vgQl7WSi54IplaWjDI.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-jA0hI7sQti46K5bJ7e70UemHnOuP4gBefAVruqRJPuQ.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-FPcUjUzpsdUnp6Hb14N1ioR0zfD_-HMrN2QM5yJ33vU.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-v8YSlBHgwJl2A9t2W7UpK2AL3JSeJeE2Rl-MXmfvDvc.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-pdD3YDBH9HaSJYiQQH88mu7qGHK-g88XMNV60LlUJf4.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-IhNFTcAsNpqP5iEGTs3bFDNHs2hG6Lo04FrOQ1PG5U8.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-qIvCI7f6YNYZKWFpFZlR_CwboIYt3tMoJ-J3RnzcJ-g.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-xZEYhWg598lN9aOpyQTlKkII_acaDE_-QV3Ky0lF-EQ.jar
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 0 seconds
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 10, 2020 12:45:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 10, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 10, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 10, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 10, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96274 bytes, hash 3c8519222e1875272f27ea3433362e22a2cf70be3f8a4a0feaf51a7bd334880b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-PIUZIi4YdScvJ-o0MzYuIqLPcL4_ikoP6vUae9M0iAs.pb
    Oct 10, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 10, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-09_17_45_42-729450764474951927?project=apache-beam-testing
    Oct 10, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-09_17_45_42-729450764474951927
    Oct 10, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-09_17_45_42-729450764474951927
    Oct 10, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-10T00:45:42.522Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 10, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T00:45:51.256Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 10, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T00:45:51.993Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 10, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T00:45:52.021Z: Expanding GroupByKey operations into optimizable parts.
    Oct 10, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T00:45:52.046Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 10, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T00:45:52.128Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 10, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T00:45:52.158Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 10, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T00:45:52.179Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 10, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T00:45:52.204Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 10, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T00:45:52.588Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 10, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T00:45:52.678Z: Starting 5 workers in us-central1-c...
    Oct 10, 2020 12:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T00:46:11.874Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 10, 2020 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T00:46:20.983Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 10, 2020 12:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T00:46:44.467Z: Workers have started successfully.
    Oct 10, 2020 12:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T00:46:44.494Z: Workers have started successfully.
    Oct 10, 2020 12:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T00:47:17.111Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 10, 2020 12:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T00:47:17.253Z: Cleaning up.
    Oct 10, 2020 12:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T00:47:17.335Z: Stopping worker pool...
    Oct 10, 2020 12:48:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T00:48:05.807Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 10, 2020 12:48:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-10T00:48:05.850Z: Worker pool stopped.
    Oct 10, 2020 12:48:13 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-09_17_45_42-729450764474951927 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 2c9ac5c0-d54c-42a5-a607-b12ec7a2dfdf and timestamp: 2020-10-10T00:48:13.426000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.706

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 10, 2020 12:48:13 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 4 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 44.277 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 55s
107 actionable tasks: 76 executed, 31 from cache

Publishing build scan...
https://gradle.com/s/j323sjkz3fawe

Stopped 3 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1095

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1095/display/redirect?page=changes>

Changes:

[piotr.szuberski] Update proton-j 0.13.1 -> 0.16.0, update activemq 5.13.1 -> 5.14.5

[Ismaël Mejía] Enable UsesStrictTimerOrdering ValidatesRunner tests on Spark Classic

[Robert Bradshaw] [BEAM-10988] Partition dataframes according to size estimates.

[Robert Bradshaw] [BEAM-10988] Batch dataframes across partitions on the same worker.

[piotr.szuberski] [BEAM-5589 BEAM-5590 BEAM-5591] Update Solr dependencies from 5.5.4 to

[noreply] [BEAM-9196] Update testcontainers to 1.15.0-rc2 (#13031)

[Kenneth Knowles] Add gradle target for ValidatesRunner against Dataflow forcing streaming

[Kenneth Knowles] Add Jenkins job for ValidatesRunner against Dataflow forcing streaming

[noreply] Delete unneeded PCollections in pipeline_from_stages() (#13014)


------------------------------------------
[...truncated 282.68 KB...]
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 09, 2020 6:45:42 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 09, 2020 6:45:42 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 09, 2020 6:45:42 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 09, 2020 6:45:42 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 09, 2020 6:45:43 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 09, 2020 6:45:43 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 09, 2020 6:45:43 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 09, 2020 6:45:43 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 09, 2020 6:45:43 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 09, 2020 6:45:43 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 09, 2020 6:45:43 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 09, 2020 6:45:43 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 09, 2020 6:45:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 218 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-Us-xILbMM9VUcWPDpdvP3YbeTISv94wSan2fL9qom6A.jar
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-XqmAE21WwHB3D_ORfHY7EIXCsOo7ErODyEQuHrw1X7k.jar
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-rmFkZrr_6paOo75zZ9wJlDBBYGg7BXkQa9KsXbOH9hQ.jar
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-54TbF0hQqfQPnPG8gaAAj-hEZgsEtN2BzFLvNXevvJU.jar
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-9EiyXlFZYDS5-X9PjOyNwLw1QgyCsYGw_4c1YzbuLo4.jar
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-_SVLF11Vfqib6PkK6u7vFdd6z4KANxqR_b41cINZJgM.jar
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-Hv5NVPYQj2s0G9UnVVcr_uCWewVSYQsWz8srPyM_qG4.jar
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-ykkuWNh2WRRRpvyirPWs1tchg-p6kKnHGDF8B-5qAtQ.jar
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-T2xkkND6i_5sb4fz-86eQ1vslyC9xkxnR0bfCgqUgnk.jar
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-f1xmuA0OA_Wmc_kP7YxnR4wVIrucWa-Fzfg7VbtGYTo.jar
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-MuBfPaL7eiEccgD6VmdkBmelDPdAlRNXkkpLRI9cs3s.jar
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-opDUdWowEQa5UT9zNDW6gKCrn7ql2HRwaqe4ckX5j-0.jar
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-ysMXXaoxna7gWkk89yNOyHB5Sglxc4FjKlCrLGcJHXo.jar
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-lzUyqWWSq8nI34asbQobw6gI_ikKzpxPI9CUOgYrqJ4.jar
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-wO_y_G0gUjaCVs414c97CerJMbVL24njxVBkkNMFZ2o.jar
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8125207470795486483.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test--A49UZv4FQaVwQJUuynAc0A4VWaxD83TYSwqb6UVSzA.jar
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-fGyGVz2j9KYqBL6x7GMuvUNwuMuGzdxaTLfwxKWdgzM.jar
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-Us-xILbMM9VUcWPDpdvP3YbeTISv94wSan2fL9qom6A.jar
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-sR1Re_5sOCopmiNlrNVgnDakEMNwCikAaigCq8KHmNY.jar
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-cLdMXmWRd304SHU1uOThf2JB7uiel6YV_9HRJ_WOJwU.jar
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-jGYAdlkfhRiprMLPzNQJdyROapryo-j_Neyj0YoBsFM.jar
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-lRfabcWv9vAd51uR_HXPX8qo1Ip-foazve_p3OnvA7Y.jar
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-GWYfg438khdAeEzOtH4l-iRx8iDNNQ9cu2hC0OQWr2Q.jar
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-HSw85ILXQ3nmaOUJ3KvaAPSJFvQRU7PQWbjGOZS-8ok.jar
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-WBgH_kYarbUZNQcwYqnuT7jPIf1iKJdRVPYU6L7rTUw.jar
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-DxQRcmLE0vDQbSEmuKRmmVM3d5bFYJOo59XFgU2hink.jar
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-PMlNGRz97WnridyEwquRpwUVRrklZLZD7gz5VRaBFEQ.jar
    Oct 09, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-P0_5ZV-y1lJxGGsUYl5NJqyArQZQZCjDwZbI15WLpBQ.jar
    Oct 09, 2020 6:45:47 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.testcontainers/kafka/1.15.0-rc2/47ff228024171dc5e0d938bc3221fae2878c1473/kafka-1.15.0-rc2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/kafka-1.15.0-rc2-bqsYOX1YvlCvptLLM_-rMZ--vSLZslIdDL-9ioqzWBU.jar
    Oct 09, 2020 6:45:47 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.testcontainers/testcontainers/1.15.0-rc2/f818b00ee47532c38b418e677a818ad4dca19db6/testcontainers-1.15.0-rc2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/testcontainers-1.15.0-rc2-avdkCap6A5JPV3hAULK21O7FqzEYvsRL8Ui8VniQ_Ug.jar
    Oct 09, 2020 6:45:47 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-37nZyXGDI-oE3RvQN2rKnrALQiTcL3wyh1I1P8XC4VA.jar
    Oct 09, 2020 6:45:47 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.github.docker-java/docker-java-transport-zerodep/3.2.5/7fca5e1998c889d486b01695f098ecf760a4ac90/docker-java-transport-zerodep-3.2.5.jar to gs://temp-storage-for-perf-tests/loadtests/staging/docker-java-transport-zerodep-3.2.5-OwcPjMMl-Jjxn-QgBEUudex_Jf9MIbZiSBhFLY5f9vY.jar
    Oct 09, 2020 6:45:47 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-zPCMX9rM_fYS8uk8fPZBtbyCjHY3_p5jpvE9djQWhRY.jar
    Oct 09, 2020 6:45:47 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-dHAgOCBjLI8MiLTKwvXQ2Z2AQ_a6uEU3KWREpVOK5pg.jar
    Oct 09, 2020 6:45:47 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.github.docker-java/docker-java-api/3.2.5/8fe5c5e39f940ce58620e77cedc0a2a52d76f9d8/docker-java-api-3.2.5.jar to gs://temp-storage-for-perf-tests/loadtests/staging/docker-java-api-3.2.5-eRxdklIyFsCE2_dXBCj1039ReXyYGkafqCMwJZWXx1I.jar
    Oct 09, 2020 6:45:47 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna-platform/4.0.0/deb6bf66918989b50209b8c9aaf3b2561af7f011/jna-platform-4.0.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-platform-4.0.0-B21i7Yfna9yzdQ_-gKpJXuIRK9chJmOVTWVH9IJEkuk.jar
    Oct 09, 2020 6:45:47 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.github.docker-java/docker-java-transport/3.2.5/27af0ee7ebc2f5672e23ea64769497b5d55ce3ac/docker-java-transport-3.2.5.jar to gs://temp-storage-for-perf-tests/loadtests/staging/docker-java-transport-3.2.5-QA3UXPFvvCRXrSNnNwbFWAm4P7tT_L9aCJHzxaijRO8.jar
    Oct 09, 2020 6:45:48 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 182 files cached, 36 files newly uploaded in 1 seconds
    Oct 09, 2020 6:45:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 09, 2020 6:45:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 09, 2020 6:45:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 09, 2020 6:45:48 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 09, 2020 6:45:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 09, 2020 6:45:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95586 bytes, hash 14c18204a58bb95109c0ccee6338f2b8c7c0e0b57f6d02f33197a16d337aad29> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-FMGCBKWLuVEJwMzuYzjyuMfA4LV_bQLzMZehbTN6rSk.pb
    Oct 09, 2020 6:45:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 09, 2020 6:45:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-09_11_45_48-14848730542711719201?project=apache-beam-testing
    Oct 09, 2020 6:45:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-09_11_45_48-14848730542711719201
    Oct 09, 2020 6:45:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-09_11_45_48-14848730542711719201
    Oct 09, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-09T18:45:48.831Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 09, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T18:46:00.044Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 09, 2020 6:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T18:46:00.688Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 09, 2020 6:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T18:46:00.718Z: Expanding GroupByKey operations into optimizable parts.
    Oct 09, 2020 6:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T18:46:00.789Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 09, 2020 6:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T18:46:00.866Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 09, 2020 6:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T18:46:00.922Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 09, 2020 6:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T18:46:00.956Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 09, 2020 6:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T18:46:00.981Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 09, 2020 6:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T18:46:01.361Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 09, 2020 6:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T18:46:01.442Z: Starting 5 workers in us-central1-c...
    Oct 09, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T18:46:08.496Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 09, 2020 6:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T18:46:25.432Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 09, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T18:46:56.294Z: Workers have started successfully.
    Oct 09, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T18:46:56.327Z: Workers have started successfully.
    Oct 09, 2020 6:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T18:47:28.517Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 09, 2020 6:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T18:47:28.700Z: Cleaning up.
    Oct 09, 2020 6:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T18:47:28.788Z: Stopping worker pool...
    Oct 09, 2020 6:48:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T18:48:20.047Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 09, 2020 6:48:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T18:48:20.112Z: Worker pool stopped.
    Oct 09, 2020 6:48:26 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-09_11_45_48-14848730542711719201 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 4dda9aba-72e0-4382-851f-6b400dc35d98 and timestamp: 2020-10-09T18:48:26.205000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     13.26

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 09, 2020 6:48:26 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 52.09 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
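
For anyone reproducing this locally, the failing task named above can be rerun from a Beam checkout with the flags Gradle suggests. This is only a sketch of the invocation, assuming the standard Gradle wrapper in the repository root:

    ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --stacktrace --info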

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 10s
107 actionable tasks: 73 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/43gy66p2s7qko

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1094

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1094/display/redirect>

Changes:


------------------------------------------
[...truncated 270.62 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 09, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 09, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 09, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 09, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 09, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 09, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 09, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
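
The coder failure above spells out its own remedy: the Row-typed output of ParDo(RowMonitor) needs either an explicit coder via setCoder() or, more simply, a schema via PCollection.setRowSchema. Purely as an illustrative sketch (not code from this build), the following minimal pipeline uses a hypothetical DoFn that emits Beam Rows and resolves the same error by attaching a schema whose field names merely mirror the projected HACKER_NEWS columns in the query above:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaExample {

      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Hypothetical schema; the fields only echo the projected columns (author, type, title, score).
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        // A ParDo that emits Beam Rows has no inferable coder, which is exactly what triggers the
        // "Unable to return a default Coder ... for a Beam Row" failure seen in the log above.
        PCollection<Row> rows =
            p.apply(Create.of("seed"))
                .apply(
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void processElement(ProcessContext c) {
                            c.output(
                                Row.withSchema(schema)
                                    .addValues("someone", "story", "Example title", 3L)
                                    .build());
                          }
                        }));

        // Attaching the schema is the fix the error message asks for; equivalently, one could call
        // rows.setCoder(RowCoder.of(schema)) to satisfy the setCoder() suggestion instead.
        rows.setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }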

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 09, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 09, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 09, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 09, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 09, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 09, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 09, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 09, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 09, 2020 12:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 09, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 09, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-eHHzPB5G05VsgKp7dhMl0Hcp0fmTE_4dAe6Q8OzrNV4.jar
    Oct 09, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-tXRlkJoIYshG4lOEHfIQW8udDybiOdwI-spUL4nqk3k.jar
    Oct 09, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-TLXbJr7b07GfNGMM_xU2TKK5FjjOw3HChQ49fye4Af4.jar
    Oct 09, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-qBMVtIufGsl-r2eju1e4lrcZn9M5f2W_4KEAmNQP7SU.jar
    Oct 09, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-xFknPSfniLOnbBt1lb4t9vS98-B9_TwymiYOTdpmGa8.jar
    Oct 09, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-7h4LM6dtaNtpwtYSFREt5kdMrIhu6ZBOuCu5uWR2O7g.jar
    Oct 09, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-K7FjxOdlmSTZKqYrLjJszlGyT-ODxndf_kCzeiocbKg.jar
    Oct 09, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-HW0AScxT8ERzMby0g_FcNMeXQmxQq5qOOvl-haccFt0.jar
    Oct 09, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-9qk6mKgePRFOgvqoxGSNke9cUa4PqJkAibfEAuL9NpI.jar
    Oct 09, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-tQwNipL_lf8ggGXn9zqV8thO-iV8L6hqUz9p1I5R1A4.jar
    Oct 09, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-A7RJZaHdHSTJA1m5M3cObxPzmWGOJoBVInMqATb5GKo.jar
    Oct 09, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-60sneUaJ9R-goOS7Rjl1YT_9nI8vj9yn59kHTBdTt0k.jar
    Oct 09, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-c3X8kJOrrHoGlZ0se7LZUumUGiVk3xWenkrZzDmqGEg.jar
    Oct 09, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2701427447633812647.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-TzjXu1L03RoWJUYXVqdPiLQJ6Mb5JjHmW4Bn_TTCP2o.jar
    Oct 09, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-TLXbJr7b07GfNGMM_xU2TKK5FjjOw3HChQ49fye4Af4.jar
    Oct 09, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-z7o1jHVeeecIsOeHn39ycEZN2Jc-7w4ntK3cMBviYlw.jar
    Oct 09, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-v-vESSj0zj7UsnKxYQ_x4dxX_UBIluY9BQ_GUuyq6Z8.jar
    Oct 09, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-B6NBQWfnseNZdgWRQYj-uKqdznSjxjjfLjk68mJgOew.jar
    Oct 09, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-E6muuWHQZBYtIWosaoLgL4Vwi8FqVbJ7-RA5fwDwH1Y.jar
    Oct 09, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-tsgm8r7c-IWpx8mV0TBRsK57d85ual3hZsjA0MLtNsE.jar
    Oct 09, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-a1bm2EVaMQ-J04pN6XoIYv2WPa1d-UqSiudEFT6q5Lw.jar
    Oct 09, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-i81KmexfLrilX4YRQdvTp6H2vjH8k-ZEiUMBkBX78eI.jar
    Oct 09, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-6tpflcV3omU0toE44S0bYfvodPTpHWJ5iGJ6OdqIejk.jar
    Oct 09, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-uSh6hNtucDmnTigF9UHjxjxi_KOrYhwPIMo0mvtkv9c.jar
    Oct 09, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-1Jy-dwW4MjYVZcNqn5QeXSwfjew3qKVbV66fAJOBMuU.jar
    Oct 09, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-PyJI90Ieizf1Nk1ddzRAik1pyzzdHjlHgUaS4YjVjyw.jar
    Oct 09, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-eiW-qso2jb-zJYoez1ExrxU712LF3d4m19Yz0eYAmqM.jar
    Oct 09, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded--T6bbNkep1KW5702kmVHgaR1iWLjxpjsP2PBZ3e-MJs.jar
    Oct 09, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-kidTvM0V0NawudeAgtpSGXvT-Vm2Kn1Hy4iPF_RNcKg.jar
    Oct 09, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-QbOkv35D79pHrLsSi55Xfk9QK2cZsauOkw_Guyg4QlY.jar
    Oct 09, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-lQBcbKSy3w5bCww-VwchFT_OweSycpTv3wzIg3_5tWQ.jar
    Oct 09, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 0 seconds
    Oct 09, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 09, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 09, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 09, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 09, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 09, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96276 bytes, hash 0b6b9ad92d2ff71709d84da7d49947e3c4a8e0bbd1c25a3834860d8b2f30dc68> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-C2ua2S0v9xcJ2E2n1JlH48So4LvRwlo4NIYNiy8w3Gg.pb
    Oct 09, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 09, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-09_05_45_16-17327153574229360426?project=apache-beam-testing
    Oct 09, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-09_05_45_16-17327153574229360426
    Oct 09, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-09_05_45_16-17327153574229360426
    Oct 09, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-09T12:45:16.277Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 09, 2020 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T12:45:27.133Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 09, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T12:45:27.771Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 09, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T12:45:27.808Z: Expanding GroupByKey operations into optimizable parts.
    Oct 09, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T12:45:27.847Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 09, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T12:45:27.927Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 09, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T12:45:27.954Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 09, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T12:45:27.999Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 09, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T12:45:28.035Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 09, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T12:45:28.521Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 09, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T12:45:28.595Z: Starting 5 workers in us-central1-c...
    Oct 09, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T12:45:53.687Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 09, 2020 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T12:46:01.872Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 09, 2020 12:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T12:46:15.123Z: Workers have started successfully.
    Oct 09, 2020 12:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T12:46:15.170Z: Workers have started successfully.
    Oct 09, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T12:46:49.043Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 09, 2020 12:46:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T12:46:49.195Z: Cleaning up.
    Oct 09, 2020 12:46:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T12:46:49.277Z: Stopping worker pool...
    Oct 09, 2020 12:47:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T12:47:33.626Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 09, 2020 12:47:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T12:47:33.669Z: Worker pool stopped.
    Oct 09, 2020 12:47:40 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-09_05_45_16-17327153574229360426 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 34dc7fa4-9752-4a25-883e-09cf18dbd60a and timestamp: 2020-10-09T12:47:40.801000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.199

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 09, 2020 12:47:41 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.042 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.057 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 38.244 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 26s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/hohhimrhnd2r2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1093

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1093/display/redirect>

Changes:


------------------------------------------
[...truncated 281.01 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 09, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 09, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 09, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 09, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 09, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 09, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 09, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 09, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 09, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 09, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 09, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 09, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 09, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 09, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 09, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 09, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 09, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 09, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-vBGmFoCNyZt6jWgfHREyq_Bq7UD7JdBcyG98RM0ACZM.jar
    Oct 09, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-LoG2ZK1kV2xffpJ4DyYONHrRa_o-aZ1qY0AWrh3SPd8.jar
    Oct 09, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-Zx0ck0b_UsPYXBQbsvU1x-ckDWVQLXEV9NLBTNIaZnk.jar
    Oct 09, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-TGuecsWj6TV5SnUGjkuXUk5Pg5c2UcFadqW7uT-Y6Sw.jar
    Oct 09, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-CLTP1ivabxSCFMmKjGopU5Yym6xay83OX2148ynRSMM.jar
    Oct 09, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-3yDVme88gGpreSngPCen_9sIhPqpJAvtPqlrc5_38Og.jar
    Oct 09, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-UaVKbkQy4fSmKfGev8OsrHsjqYrvZAnW4SqOdMqtSeQ.jar
    Oct 09, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-zCrGWvivX0I40d1u-DtmeT9i4i0YmdldCLLqQ0wyhBo.jar
    Oct 09, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-Zx0ck0b_UsPYXBQbsvU1x-ckDWVQLXEV9NLBTNIaZnk.jar
    Oct 09, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-4DQTSYxqHDFyunB4T7kCMaUej9Qdo-DtbYa-1SBfgg8.jar
    Oct 09, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-atA6mZaait8hdjHUmX4KxVoT8bH6eVW4v25q-DGtJSA.jar
    Oct 09, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-XNDvJHNhiGV3qTUbfrWyxlrcewSaaccR8XvSH2StXyE.jar
    Oct 09, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-cpb3jFqMvQp3CCU6cnpKqxEvo3N8ZwGlFRmbAAX4Xyo.jar
    Oct 09, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-ZfG2ok3j0cyw5bTNjlW57G5mpqCUAXuy_4kzHGho18c.jar
    Oct 09, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-8WDHQd1w5fnQeQQspA2Lud4cenXIOjqj7WNkDqMzZIY.jar
    Oct 09, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-BkYi1TUCeC1Lkl-dEkIfqiY4NqMC13wYkTt94_3ac5M.jar
    Oct 09, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-K7JnnCTrkC7hIlr6nI4i1EkaEGSxsq9YytWHPtG7f94.jar
    Oct 09, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests--4GEmNeBqhs1ohM6X4W9S72A1S4r8louX__7jUbMzOo.jar
    Oct 09, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-y4gnvqpN7i2Qf7rJRjdOcSuseJNu3RZyROra76bALHA.jar
    Oct 09, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1817817399932436775.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-mXdU95Kfosr1KhyRJnkx2A7ROHDYPo5XKXdpA6fxB-0.jar
    Oct 09, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-jX5aNC3f_QdFU9V_LcZVuyphl5aHKdneW9U1_4YK5R4.jar
    Oct 09, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-x9ovrzbB76uAAEfUynzH9sAFSSVgHZlGKwQdoF0qt68.jar
    Oct 09, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-iZPCGsntQH739CJw5I0nZIksj9bkKqtBaZwEh_GacsY.jar
    Oct 09, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-gfcVUd9Ug-VUo_U2o1Ocryr150qHTj28PtDqze-WnqM.jar
    Oct 09, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-0xUaKv0mwQiUVjuhRm8eChW_bwdWvQ2i0bKYKlBzJvQ.jar
    Oct 09, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-8Pb0ul0Fb2eBFqPScbtaM1-xKenvvqvYooMtTtF3j5U.jar
    Oct 09, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-6Jurl6X-OZ78V4iL-pWQSMAmGZhotrrQPPyxHk83D04.jar
    Oct 09, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-VtyNs4o2xXM9355rHNqNSa28WkVeoPSfG_PgCE1fxow.jar
    Oct 09, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-H4bSRWBdZOPzjBSmRfeaAcMoDBzOdWnpWmpWzeSr95s.jar
    Oct 09, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-KOj0kfi0ZjLncFsp5V1I8voNGwHuT2PhzHndxUVnz4s.jar
    Oct 09, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-8LxcOK7KG3829Q6HmiZREH80lKZx6uR81HM_je-Vjq4.jar
    Oct 09, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 1 seconds
    Oct 09, 2020 6:45:36 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 09, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 09, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 09, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 09, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 09, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96276 bytes, hash b92cbb81b5869f7352f330a47174ac331b63806551137b391431c401755b9e25> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-uSy7gbWGn3NS8zCkcXSsMxtjgGVRE3s5FDHEAXVbniU.pb
    Oct 09, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-08_23_45_37-10770513668651889520?project=apache-beam-testing
    Oct 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-08_23_45_37-10770513668651889520
    Oct 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-08_23_45_37-10770513668651889520
    Oct 09, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-09T06:45:37.440Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 09, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T06:45:45.493Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 09, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T06:45:46.238Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 09, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T06:45:46.290Z: Expanding GroupByKey operations into optimizable parts.
    Oct 09, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T06:45:46.326Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 09, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T06:45:46.408Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 09, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T06:45:46.437Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 09, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T06:45:46.471Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 09, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T06:45:46.495Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 09, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T06:45:46.979Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 09, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T06:45:47.064Z: Starting 5 workers in us-central1-c...
    Oct 09, 2020 6:46:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T06:46:02.719Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 09, 2020 6:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T06:46:12.757Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 09, 2020 6:46:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T06:46:40.065Z: Workers have started successfully.
    Oct 09, 2020 6:46:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T06:46:40.104Z: Workers have started successfully.
    Oct 09, 2020 6:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T06:47:13.835Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 09, 2020 6:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T06:47:13.962Z: Cleaning up.
    Oct 09, 2020 6:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T06:47:14.060Z: Stopping worker pool...
    Oct 09, 2020 6:48:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T06:48:08.097Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 09, 2020 6:48:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T06:48:08.132Z: Worker pool stopped.
    Oct 09, 2020 6:48:14 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-08_23_45_37-10770513668651889520 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 135d8828-5c07-477f-bd52-7e17851c751b and timestamp: 2020-10-09T06:48:14.228000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.735

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 09, 2020 6:48:14 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 4 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 50.866 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 59s
107 actionable tasks: 74 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/4rdankij2pb52

Stopped 3 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1092

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1092/display/redirect?page=changes>

Changes:

[Luke Cwik] [BEAM-10670] Make Spark by default execute Read.Bounded using

[Kenneth Knowles] Fail task when local job service fails to start

[Kenneth Knowles] Create parent directory tree for local job service pid, port, stdout,

[ningk] [BEAM-11039] Resolve conflicts between TFMA and Facets imports

[Boyuan Zhang] [BEAM-11022] Support bundle finalization for portable Flink in

[noreply] [BEAM-10845] Add Union support to AvroUtils (#12746)

[noreply] Minor programming guide improvements (#12953)

[Ismaël Mejía] Enable SDF testPairWithIndexWindowedTimestampedBounded in Spark 


------------------------------------------
[...truncated 286.54 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 09, 2020 12:45:53 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 09, 2020 12:45:53 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 09, 2020 12:45:53 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 09, 2020 12:45:53 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 09, 2020 12:45:53 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 09, 2020 12:45:53 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 09, 2020 12:45:53 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 09, 2020 12:45:53 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 09, 2020 12:45:53 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 09, 2020 12:45:54 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 09, 2020 12:45:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 09, 2020 12:45:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 09, 2020 12:45:54 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 09, 2020 12:45:54 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 09, 2020 12:45:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 09, 2020 12:45:55 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-YiaZYrirH1QbjhsIyPn_O8DTZ9cDT62E7sXQtASemUI.jar
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-eOEruW5yPlC89Mmq9f7TIXE5EJux2Pyi9mGDCmQk4YU.jar
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-2gvP5FCb3GW918ff5I7y3-4rB4G3w0W9s3Z3vpdfRxQ.jar
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-CfAB28RfzB27OMV5nwM_Z8L6O9qZzRsqanuU-y4CM80.jar
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-F_Be9QYyXotmdRDZvM0k8AFVgcw2w2xn4FzW4IDmOKg.jar
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-2BkZz0z6xxi2mS9Ld5vxFWtbUllbsThXOxQTB4AzAAo.jar
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-fMmkPOps7L2VNarf5ROxh9kGE9IrEp8VVzX5mQfT3QY.jar
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-yt9nbbbCXr7HB_m9yqpCkxSUXpFpeZmiO6k_-P3vrMw.jar
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5803676578695937006.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-PdW7ipb5KTgsFxB3iRXHilAkpKmr_uj7yw-ADSbIqV0.jar
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-rKbjEybvSwTXNgdrKw0FiydawjXGAKTZRVUUGpxMUo0.jar
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-uYri-5ZaFg8N2SoeF1HbCsLuz8TrT8924cl_qgzQBtE.jar
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-T5BQDlSIaRsYYUlZ0bLt0t-yTwanEtdH1dRjVpYp1Fw.jar
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-P5Yfb2yrCx9BQ9tebxKIm3cSjw1qjrxYIk0yMm7xc-I.jar
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-nP1mIC-vSZJKXmDsb4ERYzhyW6S8-s07GGa-oetZjkw.jar
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-w5o1NT9jVs6PZYRYBqrpPsj5y01QD6Xdwggh8JnH9kM.jar
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-YvbOc9noDPQ4_Iu42XfNB1baHKPEFwhHas6gD_gvM8E.jar
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-7N5BSVxvW9CCK4yqMl1JUHdRBl6qrB9oQbVn-994_V0.jar
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-gzYpJOt0ql_ewpnEhfpsW7L-2462WzV82CuX2cyzJMw.jar
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-k3yJLEP5C7bA9wW1LpX-15xwsNq2IAMzigdiC4WIuIU.jar
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-YiaZYrirH1QbjhsIyPn_O8DTZ9cDT62E7sXQtASemUI.jar
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-O2eXHRgtDK-zwInEDLmjeCZ9U2kVwig2Q79dqWDX4p0.jar
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-X-sALBMsMFCX5szt_3n-4g0sWsOjYDvcvHoHSar_3vo.jar
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-qbf_8bs-EzU7i7srCd5U9F6tWlO0TTvoEB-vbFQSavY.jar
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-MmZUkqJX7D1js5dAVuVZ_65YovwwRm9zcZHuS-aVmJ0.jar
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-tSsCVNgyz6gJbS-tk9AKhbfETsBJYWbFO3DX4u9MGBc.jar
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-ug0MaRFh_xZL4v3vCDpJwQRwtvcbGfF27kLQ3G8V7Es.jar
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-1-xu5R8KiKTwcFc8HBlnh3RIcLS5gI4pLfPMK_8DTc4.jar
    Oct 09, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-D7QwD7wDVP-F6vriFAmZ_6lHrT9mMIjdQfZiwjjxMys.jar
    Oct 09, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-UBTiIT_d13hzX2BPXc2HvJRHzq_oZdZf9aTTVQ9dxXA.jar
    Oct 09, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-mdbV0COPOsNkNOaad1Y1_k-uZg8zT17dZ0bK-tVN8fU.jar
    Oct 09, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-OXmEfM91jYt_lhGXv9rV5MdmuBsz3iChwh7pAZ5Kfv4.jar
    Oct 09, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 0 seconds
    Oct 09, 2020 12:45:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 09, 2020 12:45:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 09, 2020 12:45:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 09, 2020 12:45:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 09, 2020 12:45:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 09, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96274 bytes, hash b36c3f409a7cae697486a714a922641cce5deca4768925a64e987dc26e6a6295> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-s2w_QJp8rml0hqcUqSJkHM5d7KR2iSWmTph9wm5qYpU.pb
    Oct 09, 2020 12:45:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 09, 2020 12:46:01 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-08_17_45_59-13129324702395228560?project=apache-beam-testing
    Oct 09, 2020 12:46:01 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-08_17_45_59-13129324702395228560
    Oct 09, 2020 12:46:01 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-08_17_45_59-13129324702395228560
    Oct 09, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-09T00:45:59.897Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 09, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T00:46:07.824Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 09, 2020 12:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T00:46:09.122Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 09, 2020 12:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T00:46:09.166Z: Expanding GroupByKey operations into optimizable parts.
    Oct 09, 2020 12:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T00:46:09.193Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 09, 2020 12:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T00:46:09.278Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 09, 2020 12:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T00:46:09.304Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 09, 2020 12:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T00:46:09.336Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 09, 2020 12:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T00:46:09.368Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 09, 2020 12:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T00:46:09.836Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 09, 2020 12:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T00:46:09.913Z: Starting 5 workers in us-central1-c...
    Oct 09, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T00:46:16.679Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 09, 2020 12:46:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T00:46:37.571Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 09, 2020 12:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T00:46:56.539Z: Workers have started successfully.
    Oct 09, 2020 12:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T00:46:56.574Z: Workers have started successfully.
    Oct 09, 2020 12:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T00:47:33.141Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 09, 2020 12:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T00:47:33.297Z: Cleaning up.
    Oct 09, 2020 12:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T00:47:33.369Z: Stopping worker pool...
    Oct 09, 2020 12:48:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T00:48:23.647Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 09, 2020 12:48:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-09T00:48:23.692Z: Worker pool stopped.
    Oct 09, 2020 12:48:28 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-08_17_45_59-13129324702395228560 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): da776384-10ef-46d4-9ff9-a3db7ea684a4 and timestamp: 2020-10-09T00:48:29.010000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.768

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 09, 2020 12:48:29 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 5 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 2 mins 44.56 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 12s
107 actionable tasks: 78 executed, 29 from cache

Publishing build scan...
https://gradle.com/s/i4so47i3nvtf4

Stopped 4 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1091

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1091/display/redirect?page=changes>

Changes:

[tudor.marian] Add support for gracefully aborting workers. On complete_work_status

[tudor.marian] .

[tudor.marian] .

[tudor.marian] spotlessApply

[tudor.marian] .

[tudor.marian] Added async abort signaling up to the GroupingShuffleReader's (values)

[tudor.marian] Update abort message and add async abort signaling up to the

[tudor.marian] When aborting the work item, log the status received from the backend

[noreply] Use more specific version of Python in Dockerfiles (#13047)

[noreply] [BEAM-8017] Plumb errors and remove panics from package graphx (#13028)


------------------------------------------
[...truncated 271.81 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 08, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 08, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 08, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 08, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 08, 2020 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 08, 2020 6:45:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 08, 2020 6:45:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
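
The IllegalStateException above also names the fix: a PCollection of Beam Rows needs an explicit schema rather than a coder inferred from the registry. A minimal sketch of that pattern follows; the helper name and field types are chosen here for illustration and are not taken from the test's own code:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    final class RowSchemaFix {
      // Attach a schema to a Row PCollection so the SDK can supply RowCoder,
      // which is what the "provide a schema instead using
      // PCollection.setRowSchema" hint above asks for.
      static PCollection<Row> withRowSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addStringField("author")   // columns projected by the query above
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();
        return rows.setRowSchema(schema);
      }
    }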

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 08, 2020 6:45:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 08, 2020 6:45:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 08, 2020 6:45:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 08, 2020 6:45:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 08, 2020 6:45:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 08, 2020 6:45:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 08, 2020 6:45:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 08, 2020 6:45:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
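
At the IO layer, the projection and filter just pushed down correspond roughly to a BigQuery Storage API read with selected fields and a row restriction. A sketch of an equivalent hand-written read; the class name, table reference, and option plumbing are placeholders rather than this test's configuration:

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        p.apply(
            "Read HACKER_NEWS with push-down",
            BigQueryIO.readTableRows()
                .from("my-project:my_dataset.hacker_news")   // placeholder table reference
                .withMethod(TypedRead.Method.DIRECT_READ)    // BigQuery Storage API read
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction("(`type` = 'story' OR `type` = 'job') AND `score` > 2"));
        p.run().waitUntilFinish();
      }
    }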
    Oct 08, 2020 6:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 08, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 08, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-kgjBQ3VNSp2zYs3-TuZ9LQC9VBUbCXACGK0iWKMDugM.jar
    Oct 08, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7040709341774927279.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-R7VFMeM27nQe4EMAeZ4YeUybTwjOXUWcs80k2lL5cZE.jar
    Oct 08, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-a5KczLjGYKTHdCKte0FnOb0ZONCkeViAzwtR3Gs6leQ.jar
    Oct 08, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-7-4rN5m7b5f0vOZbDVongB7efDEPmw0sY7R78ATJLuM.jar
    Oct 08, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-Jx0FrrsTcdgUfbHxBo4pX05Em6w0xpJF_i1gJZXK1vc.jar
    Oct 08, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-iZR1hO0eS9-CdP143Ox7UU45TOTdm6t7rYzCu9QTSZ0.jar
    Oct 08, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-4RmOo9NjFmm9dGYfJyHquW90rGkJEHeJXnfD54MO9YY.jar
    Oct 08, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-CY6JnN9ICFg3aCQOh3yN_5m8c6achDXDK6Eh0aQc-Mw.jar
    Oct 08, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-KkMO2YSxL_Kap6ih3H183Mqicu3YJ5szQm_54VHapvE.jar
    Oct 08, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT--AIXu6D0HUpnUOzyEuWRDDRhY1xEIz_PQbtBX-f4J0M.jar
    Oct 08, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-gJiG5T4wCssNTV6aWPnBDcNAUauKLjeGc3HyDhqgjUk.jar
    Oct 08, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-055a8of5iS6zAHd174pYxNC66xuXWnzGPsvCM_lw87s.jar
    Oct 08, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-fQgyHAKjDhAz0WVZe45rIfroNW6Xg0GZlFQrbQItsM4.jar
    Oct 08, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-UotlESf5PWr1KUJEwZldS8K61AypMXKx5fg24rWEBKo.jar
    Oct 08, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-tP5Bvx54iASnTFJvzLgILpvW31JuCeR7LLHuhaKN3Nc.jar
    Oct 08, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-TrdXnYXcKopKSwGQcs5uO41G19WbLis2AyldiGdEi0Q.jar
    Oct 08, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-9yR537Z3_N4L5D4VWW1Bxw99aMkI_nVUAkAoDA6QQp0.jar
    Oct 08, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-FazMQ_enN_xQcAtHpyHsDznZ2XPMPF9v1XCrDyuYxh4.jar
    Oct 08, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-bjUFI3CedhBuD9wImZFTIxGggv5z0jV52QZgtJ9hTMs.jar
    Oct 08, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-fLhcBftp-jsF4kNSfYD94DjaMv7XeqjY_qqLhKN69Fs.jar
    Oct 08, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-pLSX-zGRK_yHLku3Wd0s0LicditixO5izX67snzOeLo.jar
    Oct 08, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-EIiPwPk28t2n84kMZ8XRGnWaZvb9XY8WiSMMujdgJnY.jar
    Oct 08, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-4zCFLpnxl9pH6tkv5VAceI35zL_ePKqJGn0wytYLLx8.jar
    Oct 08, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-pHAXs1IRrcau1YyQx1qrywmrpplrNezOZLO7VAIBb1k.jar
    Oct 08, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-kgjBQ3VNSp2zYs3-TuZ9LQC9VBUbCXACGK0iWKMDugM.jar
    Oct 08, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-6ziF4CkDObHF-10L2q2nSG4dRpxIYCiU2hkHYRg9RaA.jar
    Oct 08, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-ooazppGeSOUazJ31fejaggT3mLdJBNasiZRa06joEBM.jar
    Oct 08, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-ObRUKr8Wqz4pju7L66cLa2IMxwIYCrb-lsBSKrXAW_o.jar
    Oct 08, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-TXO39jS_8u_0WMpPq0cVuaG5TnuPV-5-x0bvSet-G5A.jar
    Oct 08, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-aMe4YJIwGbGaqIpIpRoB9JoEPk5OIv7VKdLXK1kkY3c.jar
    Oct 08, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-Zn_2aArDqJdkDRpyr9-r5K9-Y3m-jiox9-E49u0gHm8.jar
    Oct 08, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 1 seconds
    Oct 08, 2020 6:45:34 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 08, 2020 6:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 08, 2020 6:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 08, 2020 6:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 08, 2020 6:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 08, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96275 bytes, hash 9fe336d15ed8db2c46f885255b1a042b7505f9f08faed8d4b9520f611cf9b1b0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-n-M20V7Y2yxG-IUlWxoEK3UF-fCPrtjUuVIPYRz5sbA.pb
    Oct 08, 2020 6:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 08, 2020 6:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-08_11_45_35-2657468249763269355?project=apache-beam-testing
    Oct 08, 2020 6:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-08_11_45_35-2657468249763269355
    Oct 08, 2020 6:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-08_11_45_35-2657468249763269355
    Oct 08, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-08T18:45:35.374Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
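
The warning above typically appears when the job is launched with a fixed-size worker pool, along the lines of the options below; with autoscaling set to NONE the worker count comes from --numWorkers and the requested maximum is ignored. These are standard Dataflow pipeline options, not necessarily the exact flags this job passed:

    --runner=DataflowRunner --autoscalingAlgorithm=NONE --numWorkers=5 --maxNumWorkers=5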
    Oct 08, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T18:46:05.872Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 08, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T18:46:07.239Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 08, 2020 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T18:46:08.209Z: Expanding GroupByKey operations into optimizable parts.
    Oct 08, 2020 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T18:46:08.270Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 08, 2020 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T18:46:08.653Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 08, 2020 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T18:46:08.818Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 08, 2020 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T18:46:08.900Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 08, 2020 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T18:46:09.056Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 08, 2020 6:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T18:46:14.820Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 08, 2020 6:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T18:46:15.369Z: Starting 5 workers in us-central1-c...
    Oct 08, 2020 6:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T18:46:20.310Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 08, 2020 6:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T18:46:52.866Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 08, 2020 6:47:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T18:47:05.664Z: Workers have started successfully.
    Oct 08, 2020 6:47:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T18:47:08.057Z: Workers have started successfully.
    Oct 08, 2020 6:47:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T18:47:44.339Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 08, 2020 6:47:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T18:47:48.187Z: Cleaning up.
    Oct 08, 2020 6:47:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T18:47:48.424Z: Stopping worker pool...
    Oct 08, 2020 6:48:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T18:48:39.059Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 08, 2020 6:48:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T18:48:39.604Z: Worker pool stopped.
    Oct 08, 2020 6:48:49 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-08_11_45_35-2657468249763269355 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 8475c4bb-ca22-43ba-a481-2e0723ceaac6 and timestamp: 2020-10-08T18:48:49.990000000Z:
                     Metric:                    Value:
                   read_time                    15.849
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 08, 2020 6:48:50 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
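
The warning above indicates the InfluxDB publisher is wired in but unconfigured; it looks for a database and measurement name in the test's options. The option names below are an assumption about the publisher's configuration keys, with placeholder values, rather than settings copied from this job:

    --influxDatabase=<database> --influxMeasurement=<measurement> --influxHost=http://<host>:8086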

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 3 mins 29.048 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 35s
107 actionable tasks: 69 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/at5iwcghglwvg

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1090

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1090/display/redirect>

Changes:


------------------------------------------
[...truncated 272.24 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 08, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 08, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 08, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 08, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 08, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 08, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 08, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 08, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 08, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 08, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 08, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 08, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 08, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 08, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 08, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 08, 2020 12:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-8xP1UhgogxUFFlVnu7cO0zrTQQFwkp9V9diXBxvSvvo.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8416488372988475559.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-hh6S0D-x0xbihtx22oHgmtg_j7EuTBStijYb35FV5nI.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-38oibo2USbjsQUyzeWQoBvpXO2z9bmeZLmZWcidmqXA.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-vB_-SFE6xdgEF9OPsi7nm5Gu75lPvnTCHWbxWqYZG9s.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-8xP1UhgogxUFFlVnu7cO0zrTQQFwkp9V9diXBxvSvvo.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-7sG02xzc95fQtnQYDR7_1zjwyOGsQQARKKOml2G9hNY.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-R0bIcPbAD_I7xaGpEPVsKLTmPyTPzELQSm4j3A2Mm4w.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-4D5hRUbIU9BgOv_Dlu3E76L2Q6mHUHOy3c7XHsIKclE.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-s27Jbom2pk6oxdNVUkWMCY_VYjUsIJAjuXpAudJTB9g.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-q4_W4UMoq7nu7OdT-NIx7ymHLx6rjtlhfKIJ0OZSbHY.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-IJglzr_HeYh01rVBN1K8hfnDgA3_DWpc3Jlv7MduX50.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-y0HOv-t0Cs7OaDIZmyo6LokpPt8TmriGMMe99kZu_0o.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-iGCM9FhIpmpXesnfvH4evfINIWVcIwBMrdq5e4M4xaI.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-Sa0viBzDKJkSl8RryvnqJ543gI8bjUws1liJL42BeP4.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-fDOGBA3lmeG5g_DKwSJnNCBLvRUIuvCOrimo8ehlvNA.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-NvSRZW_k2GLptIxjunwfmnKRGGhevsVZ70IXw7yXE10.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-l8MkmomegkeqAEMd5cbYf5hWZovvoOcP9qLoWhy5_PY.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-hiVtBMD4CPMgKqZsEyvVJRTlomuTuNfMWtdbAFU9MAI.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-SiUGia8sMvNcKPGDFixmWqkY6VOz58Ut6UM6FQZ4DUQ.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-DDxKsNHxpym0fU_2MDsmHTthmh0db2meFybxwJsdWGA.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-FKNP1-URpQrirGvtuXHuR07ZDmZ49dEV0zUnb1aDeNY.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-yRA2vdcN_iflmW09Xvmrsk2ivH60TZr7AY1vwihlWb8.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-XxKE4cmfM_xcwIwcmE9UIwx4Ov7g2qawxUUBENrfQkE.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-JSm4O2AFVJRZeXKhQZthZx2CZTDBhz6N0bGJ-eOjcfU.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-zKrlvu9ZVjS4FBQJ9OZOmv3qArqYNvEXlvSNoq720k0.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-tRC140H5oi3Tfa0nzMUjY6_fYZmjL4kJtGGS-YCfDac.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-rOAHJuOvCxxqePNKX8tgQjyWjibP-7YosjKudd10ZLo.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-YnL4nox1_uFvY7BKHA0knqfY3pgF5A2wgxkfGr6dwSk.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-GUmKrmnGu5_bTd6KM4hWlbzgb7OJZZjR7sdJUgq2nmg.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-L2gTgjwrg9468y79h57xQTp6GIP2JEYmLECatyVQHMM.jar
    Oct 08, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-QBM-sbChjS4dgkdWnUxa81dNqVY4gRYh8pdSkbmb9_A.jar
    Oct 08, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 0 seconds
    Oct 08, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 08, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 08, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 08, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 08, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 08, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96274 bytes, hash d653d2c116393e8267d677f4af418b840ba28b7b64ae7d34b71f4d8630c41e79> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-1lPSwRY5PoJn1nf0r0GLhAuii3tkrn00tx9NhjDEHnk.pb
    Oct 08, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-08_05_45_15-11466901415265685930?project=apache-beam-testing
    Oct 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-08_05_45_15-11466901415265685930
    Oct 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-08_05_45_15-11466901415265685930
    Oct 08, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-08T12:45:15.678Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 08, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T12:45:28.381Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 08, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T12:45:29.125Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 08, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T12:45:29.221Z: Expanding GroupByKey operations into optimizable parts.
    Oct 08, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T12:45:29.271Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 08, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T12:45:29.487Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 08, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T12:45:29.551Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 08, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T12:45:29.615Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 08, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T12:45:29.681Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 08, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T12:45:30.894Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 08, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T12:45:31.013Z: Starting 5 workers in us-central1-c...
    Oct 08, 2020 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T12:45:50.382Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 08, 2020 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T12:45:54.314Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 08, 2020 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T12:46:14.243Z: Workers have started successfully.
    Oct 08, 2020 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T12:46:14.354Z: Workers have started successfully.
    Oct 08, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T12:46:54.322Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 08, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T12:46:54.486Z: Cleaning up.
    Oct 08, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T12:46:54.592Z: Stopping worker pool...
    Oct 08, 2020 12:47:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T12:47:47.167Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 08, 2020 12:47:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T12:47:47.244Z: Worker pool stopped.
    Oct 08, 2020 12:47:53 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-08_05_45_15-11466901415265685930 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 1a89bf41-e024-43bd-ad88-205bc1fe32d0 and timestamp: 2020-10-08T12:47:53.294000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    19.278

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 08, 2020 12:47:53 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 50.747 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 37s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/utjswdb2d5mqk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1089

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1089/display/redirect?page=changes>

Changes:

[noreply] Update BigQueryAvroUtils.java

[noreply] allowing BOOL and BOOLEAN

[noreply] [BEAM-11019] Fixing Go SDF progress reporting with multiple windows.


------------------------------------------
[...truncated 272.22 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 08, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 08, 2020 6:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 08, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 08, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 08, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 08, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 08, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
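
A minimal, hypothetical Java sketch of the fix suggested by the error message above: attaching a schema to a PCollection of Beam Rows so a RowCoder can be inferred. The schema, input data, and pipeline below are illustrative assumptions and are not taken from BigQueryIOPushDownIT.

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;

public class RowSchemaExample {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    // Illustrative two-field schema (an assumption); the real HACKER_NEWS table is much wider.
    final Schema schema =
        Schema.builder().addStringField("author").addInt32Field("score").build();

    // A ParDo producing Row elements, analogous in shape to ParDo(RowMonitor) in the failing test.
    PCollection<Row> rows =
        p.apply(Create.of("alice:3", "bob:7"))
         .apply(ParDo.of(new DoFn<String, Row>() {
           @ProcessElement
           public void process(@Element String kv, OutputReceiver<Row> out) {
             String[] parts = kv.split(":");
             out.output(Row.withSchema(schema)
                 .addValues(parts[0], Integer.parseInt(parts[1]))
                 .build());
           }
         }));

    // Without this call, coder inference fails with the IllegalStateException above:
    // "Cannot provide a coder for a Beam Row. Please provide a schema instead
    //  using PCollection.setRowSchema."
    rows.setRowSchema(schema);

    p.run().waitUntilFinish();
  }
}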

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 08, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 08, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 08, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 08, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 08, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 08, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 08, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 08, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 08, 2020 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 08, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-rQyOEw9Z8Trq5qY4oQRIn-e5zmSC08Do2r7LcmSKh60.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-9rRTmtxEiAt-FfdAmzEgh0yMWik9U7nwmGHSV-lkhRU.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-rQyOEw9Z8Trq5qY4oQRIn-e5zmSC08Do2r7LcmSKh60.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-HaWBOVvxhWBKfIcsOZvSZyOkmWbOSKYb0SEoYcoHI_s.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-o0NJ6YBL79GKtVbYUQTqCROqSdu5GH-KeixhWYM-aqs.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-ZWPgdkwufo5CdUtyv9BZrL3QWePaPVzN5hqbtg1mM-s.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-MBNf4V8WWca3D50qYhyVjhlM2vFpVmSfjCf6k2WegL8.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-9f1L3qlCrkw7n4aD1-JCNvyJYk6ya5v9q7GwSgbsmL4.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-uwKAjowCi-WFH5BiPQL-JN6qZLLOufxAppw49nRZIJ0.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-AcpVWju9bOq3053SIPYfAhzdksCSPjbgVyXuEwt2vMs.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-ZNAvAkSsgQkTC1kIjQuqHdCNSxOo6uNNrOjIpjszqtc.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-Wi2GwnTLybnBALWELmhzpBo3PrVOuKAj_xGmaECCjdM.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-61XNK5G8igqfPHT7U2SmlfN1yrUGZ05MKIrOsy4l4z4.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-iuxQBRypljBIdQ29poDhPIloRC-XfF2Ug1HrSOK_eAo.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-MfNUyNfigr7NNOTWFJkxPcKgoZWNUs5EkrcgI8oXAVw.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT--EVIK8h5JhnfM2hrTSPVedNP62-g8Rp6OKnpddnUUiA.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-iLXbcTLUVrHnpsXcgfMO6hiED6Q8yZ6L8m0Gghgb0CY.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-P5oH3se8k34mPWViJu1UUEOqAN3nXzjYTxJEtzWvBtA.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-ucWRDcXRa2Eb7G5scJPNog0_BeniWOWBhhZV5o8j2PI.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5233357371866929568.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-k80sIeFheS2JHhi4FI0K_PR9-VQaeHPn0rdzzIuKWOE.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-CWMPfw3dHG3qABFVYG_cKXHe56axxUN12Y3_4VBmSjQ.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-Kj9OExMv6DACLwiPTp5YTR1jwkcW5h13tDlEEdNIiJM.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-bmSTCIlc3PzzezYwPvL8-nS7JEf2uq00-r3o8UiHn4w.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-eiAhCKad16xXNOwn1wa5K5VengKXVOal8cLJiI3cV8U.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-dB_xXU3timxopKgs6nvMVvz7C_BiSXY6CNbJvXRc74M.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-kywYuLCvwp7y35sx6skPtlCwnbPVufPr--dXCzmCUZw.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-QQhlU29GybJJglouqy2zBAroxM2t-WQlwBoY0nyqu9o.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-6BS4PBqBdvM01Or8mosUraiIKNJSl_b0xym3Owa0_ns.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-CD6E0TmTfuc5S9P-49D7Ix-iLek5sO5PnzpDMLH8m24.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-r-JIiup1R0jichtRFfxu4ePAHzgBeW2LVWzGskgFXMY.jar
    Oct 08, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-MZBCRWAwBytKKqmyfO8BUmBPlJ_C-lROSbgnkODatXk.jar
    Oct 08, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 1 seconds
    Oct 08, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 08, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 08, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 08, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 08, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 08, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96275 bytes, hash 2e8b4a7bf13389192af1671bd1af032bdb4b6f3a7ee6fa855a2a828564eccb0d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-LotKe_EziRkq8Wcb0a8DK9tLbzp-5vqFWiqChWTsyw0.pb
    Oct 08, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 08, 2020 6:45:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-07_23_45_16-13038547261317505495?project=apache-beam-testing
    Oct 08, 2020 6:45:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-07_23_45_16-13038547261317505495
    Oct 08, 2020 6:45:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-07_23_45_16-13038547261317505495
    Oct 08, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-08T06:45:16.571Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 08, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T06:45:31.151Z: Worker configuration: n1-standard-1 in us-central1-b.
    Oct 08, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T06:45:31.872Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 08, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T06:45:31.965Z: Expanding GroupByKey operations into optimizable parts.
    Oct 08, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T06:45:32.008Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 08, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T06:45:32.068Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 08, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T06:45:32.095Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 08, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T06:45:32.126Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 08, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T06:45:32.161Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 08, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T06:45:32.563Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 08, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T06:45:32.647Z: Starting 5 workers in us-central1-b...
    Oct 08, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T06:45:45.107Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 08, 2020 6:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T06:45:57.769Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 08, 2020 6:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T06:46:15.716Z: Workers have started successfully.
    Oct 08, 2020 6:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T06:46:15.758Z: Workers have started successfully.
    Oct 08, 2020 6:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T06:46:50.725Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 08, 2020 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T06:46:51.101Z: Cleaning up.
    Oct 08, 2020 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T06:46:51.214Z: Stopping worker pool...
    Oct 08, 2020 6:47:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T06:47:45.952Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 08, 2020 6:47:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T06:47:46.033Z: Worker pool stopped.
    Oct 08, 2020 6:47:55 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-07_23_45_16-13038547261317505495 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 9c0fd44c-63b7-4b6b-9f1b-e50338174ef3 and timestamp: 2020-10-08T06:47:55.109000000Z:
                     Metric:                    Value:
                   read_time                    16.274
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 08, 2020 6:47:55 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 52.644 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 39s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/netj4cmcbkw4o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1088

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1088/display/redirect?page=changes>

Changes:

[samuelw] [BEAM-11034] Avoid build-up of stateful garbage collection timers for

[Luke Cwik] [BEAM-10670, BEAM-11028] Ensure that UnboundedSourceAsSDFWrapperFn

[Luke Cwik] [BEAM-10997] Close currentReader in trySplit

[noreply] Merge pull request #13001 from [BEAM-11041] Matching job creation and

[noreply] Add python schema inference docs (#13005)


------------------------------------------
[...truncated 281.97 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 08, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 08, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 08, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 08, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 08, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 08, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 08, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 08, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 08, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 08, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 08, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 08, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 08, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 08, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 08, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 08, 2020 12:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-mcSNKlXB0bYiBymmv-nczQh1B4nqgVQT1-PFWAojT8k.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-_wiOV1Z0kN_yX_Q-YMpwum-z1B94pUwWMezgNUFSWqs.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-gmGPIbEFeOjNMszAbq5TNlq_nGYvywXtdS7D7WQ-HD0.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-5IXajaaMZ_nPTnwQHBu4xucj5L2uvJ7l8ZMJMz02jqk.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-1qQ3nO2bvKFFJRwTYzfwD7zBRTEvSiq0aSkXZDJQjBY.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-mcSNKlXB0bYiBymmv-nczQh1B4nqgVQT1-PFWAojT8k.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-EVWEw4L7KGp8tT5Ials51cmDYO8wkz8EMwrXPDhdp8g.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-PxfllqWiYVrow2WlMmWQPWe-NO_I4cY6Ow1tvhyvGf8.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-ARQY8AHCFzLwvw7LFMilhgubV_15Xxu7A4lqPsrbk2M.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-4A_wlHUsSy4Jr55uR7-Uq7qUgFrtAwwqEsTKVDP5h2A.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-qxBiYfZuLQyaBKPCtea8d-cjmtX-xn7OdEOkotLGoUs.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-hec9DP4nkljbK_tCOrQfujMFKF4UFVFcvO1RvT4_mc8.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-0raXfZsM-NUGkpMYYzxOOo-fpQaPa7zmPrMqnEvJ1Hs.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-nUZOIKI9Olyf_gOxko9-6mCLLqswQ4Bf_-GCOKaNkaQ.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-ivaAwMcAzmXlRMFP_ewMHKfs92pFAsQ6WoCMS3cekOI.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-tRGmEEeWCXGfwAI4QTqjxSQEhzS4pl140rsSdY-WsyQ.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-MZac7b4A2Y9ZJDszSw97IUhD9I6jOurDA0tE_L7p6fk.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-ujIzdw7o4sCmcZznmy02RGpfwwQnEHj1R5c75j2xRxE.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-uIq_5M-5fwh_Oj-0I7hKx_uy_97Zrb03zdVGGgNqsqk.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT--VkjAvh55FeQDPFq689AC351W2TPgkA9MhA8vXrF-G0.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-C1E5hTyeU0JeZqFVC1l17N6pXsE886rjYNMDQAxg89Q.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8286117872923329153.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-8TUwIUrXvjtrzclrdw1pUeZ6BU_SxdSnHalO9bbpVH8.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-aB7czR90wsAP0Ap_e-2kOIM2YzjhKzH968_GeB1mQ-A.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-U-zR7jRBPbw6-2G4yBbMPqZ15ViG9oaXSF2pTKFGsKE.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-zY_KmNsuw3HptrZ67Tq7kax2kDRZxlIy0mWA7QOKkGs.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-XX_MqzaQCE6QYqxDadVfUWCQ33hD_nbBuM_M3KBzhyA.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-rMiDs2f_m5D8q232414Adr8ln8WMgf7rHMknvYP0bQU.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-HSUDJrIt4VrVDmJvffgIvfPTYqmBerehiCYMbALus54.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-reqyzM9uZadR-CDQRjEzLC5CL-7jZTigPk1OuzkgBcA.jar
    Oct 08, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-5-WvUK_0rScf7UnyhwtAhO5xAubLuk6_e7tUU38pavs.jar
    Oct 08, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-O7ZRYTU5NGi5XR5Q0YRcFn4GAej82JWFgJ0rrY0gQuc.jar
    Oct 08, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 1 seconds
    Oct 08, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 08, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 08, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 08, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 08, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 08, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96274 bytes, hash b5d62d0d7e8d9cff2f2a6f9cf6def1cc61e6155210fa71819cd9117344bffbd3> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-tdYtDX6NnP8vKm-c9t7xzGHmFVIQ-nGBnNkRc0S_-9M.pb
    Oct 08, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 08, 2020 12:45:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-07_17_45_44-2818395155776791685?project=apache-beam-testing
    Oct 08, 2020 12:45:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-07_17_45_44-2818395155776791685
    Oct 08, 2020 12:45:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-07_17_45_44-2818395155776791685
    Oct 08, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-08T00:45:44.048Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 08, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T00:45:52.924Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 08, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T00:45:53.601Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 08, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T00:45:53.642Z: Expanding GroupByKey operations into optimizable parts.
    Oct 08, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T00:45:53.674Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 08, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T00:45:53.929Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 08, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T00:45:53.965Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 08, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T00:45:53.993Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 08, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T00:45:54.017Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 08, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T00:45:54.435Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 08, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T00:45:54.521Z: Starting 5 workers in us-central1-c...
    Oct 08, 2020 12:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T00:46:19.714Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Oct 08, 2020 12:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T00:46:19.755Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Oct 08, 2020 12:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T00:46:25.125Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 08, 2020 12:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T00:46:25.992Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 08, 2020 12:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T00:46:39.177Z: Workers have started successfully.
    Oct 08, 2020 12:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T00:46:39.220Z: Workers have started successfully.
    Oct 08, 2020 12:47:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T00:47:23.486Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 08, 2020 12:47:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T00:47:23.624Z: Cleaning up.
    Oct 08, 2020 12:47:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T00:47:23.699Z: Stopping worker pool...
    Oct 08, 2020 12:48:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T00:48:15.675Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 08, 2020 12:48:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-08T00:48:15.725Z: Worker pool stopped.
    Oct 08, 2020 12:48:20 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-07_17_45_44-2818395155776791685 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 90695272-6580-4de4-8351-e34e9205fd2d and timestamp: 2020-10-08T00:48:21Z:
                     Metric:                    Value:
                   read_time                    22.707
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 08, 2020 12:48:21 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 4 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 51.742 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 4s
107 actionable tasks: 74 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/ed3w3bwybzxha

Stopped 3 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1087

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1087/display/redirect?page=changes>

Changes:

[Boyuan Zhang] Update the test invocation cmd for portable runner.

[tobiasz.kedzierski] [BEAM-11035] Pin versions of untrusted 3rd-party GitHub Actions

[tobiasz.kedzierski] [BEAM-11036] Add explanatory comment to PR if GA workflow gets cancelled


------------------------------------------
[...truncated 272.04 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 07, 2020 6:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 07, 2020 6:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 07, 2020 6:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 07, 2020 6:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 07, 2020 6:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 07, 2020 6:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 07, 2020 6:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

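For readers triaging this failure: the exception above points at the standard Beam remedy of attaching a schema (or an explicit coder) to the Row PCollection before the pipeline is finalized. The following is a minimal, generic sketch of that API only; the helper class, the collection name `rows`, and the field names/types are illustrative assumptions taken from the SELECT list above, not the test's actual code or the actual fix for this build.

    // Hedged sketch: attach a schema to a PCollection<Row>, as the error message suggests.
    // Field names/types are assumptions based on the projected columns above.
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaExample {
      static PCollection<Row> attachSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addStringField("type")
                .addStringField("title")
                .addInt32Field("score")
                .build();
        // Option 1: attach the schema directly, so a RowCoder can be inferred.
        return rows.setRowSchema(schema);
        // Option 2 (alternative): set an explicit coder built from the same schema:
        //   rows.setCoder(RowCoder.of(schema));
      }
    }
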
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 07, 2020 6:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 07, 2020 6:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 07, 2020 6:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 07, 2020 6:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 07, 2020 6:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 07, 2020 6:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 07, 2020 6:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 07, 2020 6:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 07, 2020 6:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 07, 2020 6:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 07, 2020 6:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-ae98b2EP9ktYgMSmvcDYydIZ1fFyNCoFbYGeESLghFE.jar
    Oct 07, 2020 6:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-12ioFpMFMRkBB4s1W9pCNI0XN5bhAQKhu8zn-9p4VG4.jar
    Oct 07, 2020 6:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests--QBCCm3YGAcqkiMpHuY9X5Bfti-sDrXlgC7OA3Uy49Q.jar
    Oct 07, 2020 6:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-g4INTFckx82zn4x3u1lrZ5sjHPEs4exXC5jP7OowIbI.jar
    Oct 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-rfaA2ik5hNQ_WJbThmh0sYkDoc4uv0kk8L6ABG-ZTkw.jar
    Oct 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-Y3_6qHuxhgm7AB79FLKe4MpoF_Xu05T8Yo3nVzOBcNU.jar
    Oct 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-BhdIn_qsPetlpXGTNSGbfhkGk0wtS6BHlrr-XIZcQrY.jar
    Oct 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-moXc5Gx4KzUnb4ZP71TKbLbMX-wADXyfa3-tnZbZayM.jar
    Oct 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-F3k9BuTHFx7EtGhO8sB-f4GZu5re4M6chGua69qzm34.jar
    Oct 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-assdikcfVP5ax8zfmg4ZNCeKAdI2wJlFIvwtpcjN3fs.jar
    Oct 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-jko6gY4_22LeKOBgGoyhLT-9Jr6CJUMDKHQcD7eybVE.jar
    Oct 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-Tpmi9Swar9qGwXIZgFRezuROfl7tJeLVrZHYsCmPNUw.jar
    Oct 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test9004092158319563373.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-qou12UA96XsNxUAz3lDizbU7Ee67Z-F_xvyo0a55gy0.jar
    Oct 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-rnNPLznl0-Dpl7SakINHUWQwVKEbwnoONXXRPnojdN8.jar
    Oct 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-yPvnJMx-fE4HtfTqTRFPG3O7LvBw4Hw4TZd-UJgvbnc.jar
    Oct 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-oEbATHs_sR23f2AIaoewf563WwoiNCYFJEs9TiTSixM.jar
    Oct 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-TF3yhWg6Ce3lGRLobZXb7WG3jUx9idDMMrBtwGODXXA.jar
    Oct 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-dbCbuhISs4F-TGFRS9OBpx-MN_FrICZvoDAzM6uuzms.jar
    Oct 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-ae98b2EP9ktYgMSmvcDYydIZ1fFyNCoFbYGeESLghFE.jar
    Oct 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-CWaR8EyfA7SUCNOupGQ76Gf1g36woznUPYBJtrrZcbI.jar
    Oct 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-CYGq88TZcLC6SJO1edOfBGhEacz3TjxKf5N-qy6DWUs.jar
    Oct 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-pWEhpB5uGjmR4T5ZQ5afFSOJpZEGnhyn216-wS-Ha7I.jar
    Oct 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-INjTv4m6N7cb4ZHaXOdmw7Wh-e-vbPydFPHzCAp4P_o.jar
    Oct 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-ASzAfZhpXUwzQfy8XQIDkkZewZB4PRdxyl4r_2ogClw.jar
    Oct 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-9fPTWPjFiI0PvJSeK7i77KpLwSabNsW48xhLKpwasuI.jar
    Oct 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-y29nIu7c6WxjwkJSs2Zi8yPX42BMeMK8HUwCKoKcg5E.jar
    Oct 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-kwxploP-RdlUXpdv1vlox4AbFbKQfB3Is0w8oW9srZE.jar
    Oct 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-yhccfnnetpqWtBFsd12Q5g_rfXkiEim0z1h8RBhRFUk.jar
    Oct 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-NLxxwSRXvoKHuyAMnePdbC49RbH-xaOL5zVelDy39i4.jar
    Oct 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-KFMYKF5poeI3LsGUndReKx-I1gA-0i2dl__wDQry2B0.jar
    Oct 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-WKpbqNiBG5c-CZyIdAGLOcvfYCWagGzcIZ-Uzododlk.jar
    Oct 07, 2020 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 1 seconds
    Oct 07, 2020 6:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 07, 2020 6:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 07, 2020 6:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 07, 2020 6:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 07, 2020 6:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 07, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96275 bytes, hash 1aa905231a2f40c18a6ce4043a3dcbd7b059aa090914ab52e7e32b0aed597a9c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-GqkFIxovQMGKbOQEOj3L17BZqgkJFKtS5-MrCu1Zepw.pb
    Oct 07, 2020 6:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 07, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-07_11_45_15-17746700083485615797?project=apache-beam-testing
    Oct 07, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-07_11_45_15-17746700083485615797
    Oct 07, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-07_11_45_15-17746700083485615797
    Oct 07, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-07T18:45:15.623Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 07, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T18:45:23.818Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 07, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T18:45:24.545Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 07, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T18:45:24.586Z: Expanding GroupByKey operations into optimizable parts.
    Oct 07, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T18:45:24.624Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 07, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T18:45:24.707Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 07, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T18:45:24.744Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 07, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T18:45:24.780Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 07, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T18:45:24.835Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 07, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T18:45:25.593Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 07, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T18:45:25.676Z: Starting 5 workers in us-central1-c...
    Oct 07, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T18:45:51.514Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 07, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T18:45:55.935Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 07, 2020 6:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T18:46:09.134Z: Workers have started successfully.
    Oct 07, 2020 6:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T18:46:09.162Z: Workers have started successfully.
    Oct 07, 2020 6:46:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T18:46:41.533Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 07, 2020 6:46:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T18:46:41.793Z: Cleaning up.
    Oct 07, 2020 6:46:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T18:46:41.896Z: Stopping worker pool...
    Oct 07, 2020 6:47:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T18:47:31.422Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 07, 2020 6:47:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T18:47:31.486Z: Worker pool stopped.
    Oct 07, 2020 6:47:37 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-07_11_45_15-17746700083485615797 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 64ce862c-afab-4424-8998-49ecb53ad353 and timestamp: 2020-10-07T18:47:37.294000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.375

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 07, 2020 6:47:37 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 35.828 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 22s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/npm7igtfqdxo4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1086

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1086/display/redirect>

Changes:


------------------------------------------
[...truncated 272.67 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 07, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 07, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 07, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 07, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 07, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 07, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 07, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 07, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 07, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 07, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 07, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 07, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 07, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 07, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 07, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 07, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 07, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-KU3X3_llNVK_FMCduOmuKfn1NhQpMc-av_wCb5-efog.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-1VVyaHHMMaWG55LEs-E4rlw5pWroGI8OqsJkPWfZN0g.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-LYET2oF1Jjy3Iv_LGDkkXiYv4U15Xd5xoZm3wRQkAQE.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-r7R_TdBkbqcOTuJyMteQ-zIXXhHqdL3gbs3hSzkmPi0.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-JUmDLwJvQ3wOeJnPSaosecWfnqzVkvgOy5wJAWwuylE.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-oUNuHxe4sv-ZEtnvUAMqzTAU2MXayDDbME7zB8RkF9M.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3239090089145729629.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-T8v7tu6zLTda4Usd5-RRsrHsPo-yIcuoc0ogvgkCg3s.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-oBsmnh3xeHKusGZbu9mONPwFoKF68YHLPmDR9Wqogjo.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-2WgcUuIinCUrVx1oRtw41ghP8JPxJrt4QxIIrls8_UI.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-Au2fWw-sGJAbYWv7aO7OCRoaA3Xz6UaL3RWv4yyfRTk.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-5EY1AwOnE0KqvdvM5t2uUUJ6ulSqvHhXtbR6jo2yM-8.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-eDb52Oi-iijyRYikmVr-c-Y4vfHvaLVLALM1-A6MqvM.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-WdV-hG4p7RA6o42YULx1OgpDjI8_boyHtdtlVSPqHw8.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-n0Kg2tqGWdicQloXaplT7f4ErKXKjQ2I8WKsJ-Hhgo8.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-CEO9qhWaS8kQ5U_fcIw-KKeA4dAJv6UzQIXUJOKAWS0.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-0dUoMozILF3I4CLT1nivedx2lyNTPFzzSUEJDRMolzw.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-TOe90xandoq2dknRPt6Isr7v4s4ZwYk7WGzqqYXkzT4.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-nfSXpi-IdAWzwGpmKZ18Tj-hJK3Qt9QVSDsnvejN1nw.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-szxCRh6USqO2R23EvN-Kp952ebeZ4XzFKtcZ9Ti6XeI.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-pPkBcSmuAAY0xGbzao3s0ezrGjcchCCDViX18UDtzNA.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-9_sSEPCR7dIrgG4Cyv0z1t_ck68YdpgD9IcV6oi4AA0.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-AImz8xUbMGaNrZHVD2V-3HLWZzDDcifsEk8CvuLfIZw.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-KU3X3_llNVK_FMCduOmuKfn1NhQpMc-av_wCb5-efog.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-MWGT3ePivKceK7aNH1mqXw2U9p218sB0n5aIBn2_tB4.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-0PIcF7z0Nt1gDsXQ5yac0XEp4WhmmLyFiCSaMrE-grI.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-sYryY_y8K_wowVNegpr9e2Pqx38mc6p6WBF7gLM3KHY.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-leAfXVcxd7J73JLTe2YCPetDnFb-fe7LOiNBEeGWfmo.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded--d7J0JJIE1LSk2auWZso9K_S8bp3R3-JZM8Qaan5pQ8.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-RuUEfUuFy0qHlxIs9ol3IWWdr9JIZvJIgvG2t6nSnUM.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-luGauQWSdefQLtcrQUvbUg9V7cZ1-bgAmN6jQn1D8ik.jar
    Oct 07, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-cauONAmRsr0SjqiQ9cQUuezxbE6VpLrD4O4i8HSWlto.jar
    Oct 07, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 0 seconds
    Oct 07, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 07, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 07, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 07, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 07, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 07, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96275 bytes, hash f6288ce03cb7e7dae5d08946fcc07c69696d5d43ec1bf4215b1fdd0bdd911fd2> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-9iiM4Dy359rl0IlG_MB8aWltXUPsG_QhWx_dC92RH9I.pb
    Oct 07, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 07, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-07_05_45_21-682441098795551969?project=apache-beam-testing
    Oct 07, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-07_05_45_21-682441098795551969
    Oct 07, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-07_05_45_21-682441098795551969
    Oct 07, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-07T12:45:21.904Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 07, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T12:45:30.732Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 07, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T12:45:31.391Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 07, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T12:45:31.431Z: Expanding GroupByKey operations into optimizable parts.
    Oct 07, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T12:45:31.457Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 07, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T12:45:31.560Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 07, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T12:45:31.586Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 07, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T12:45:31.625Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 07, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T12:45:31.682Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 07, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T12:45:32.246Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 07, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T12:45:32.321Z: Starting 5 workers in us-central1-c...
    Oct 07, 2020 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T12:45:43.485Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 07, 2020 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T12:46:00.232Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 07, 2020 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T12:46:20.255Z: Workers have started successfully.
    Oct 07, 2020 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T12:46:20.345Z: Workers have started successfully.
    Oct 07, 2020 12:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T12:46:56.130Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 07, 2020 12:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T12:46:56.273Z: Cleaning up.
    Oct 07, 2020 12:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T12:46:56.346Z: Stopping worker pool...
    Oct 07, 2020 12:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T12:47:54.743Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 07, 2020 12:47:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T12:47:54.782Z: Worker pool stopped.
    Oct 07, 2020 12:48:00 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-07_05_45_21-682441098795551969 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): dc965b27-80a0-448d-b738-532cd42b4ff9 and timestamp: 2020-10-07T12:48:00.089000000Z:
                     Metric:                    Value:
                   read_time                    17.729
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 07, 2020 12:48:00 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 54.283 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 44s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/ngzblncmyyrzc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1085

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1085/display/redirect>

Changes:


------------------------------------------
[...truncated 273.65 KB...]
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 07, 2020 6:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 07, 2020 6:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 07, 2020 6:45:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 07, 2020 6:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 07, 2020 6:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 07, 2020 6:45:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 07, 2020 6:45:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
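
The root-cause list in this failure names the two standard remedies: attach a schema to the Row-typed output, or set a RowCoder on it explicitly. A minimal sketch of both follows; the schema fields and the input/RowMonitorFn names are illustrative assumptions, not the integration test's actual code.

    // Sketch only: hypothetical names, but real Beam SDK calls.
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    Schema schema =
        Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

    PCollection<Row> monitored =
        input.apply("RowMonitor", ParDo.of(new RowMonitorFn())); // hypothetical DoFn emitting Rows

    // Option 1: what the error message asks for.
    monitored.setRowSchema(schema);

    // Option 2: an explicit coder, as the error message also suggests.
    monitored.setCoder(RowCoder.of(schema));

Either call has to happen before the collection is consumed downstream, since coder resolution runs while the pipeline finishes specifying the transform (the finishSpecifying/finishSpecifyingInput frames in the trace above).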

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 07, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 07, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 07, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 07, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 07, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 07, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 07, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 07, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 07, 2020 6:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 07, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-ZvOlgK3b3vDKlBIvh0G-madf_D8eGzFTMgPUVQ9Ho2c.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-2QEV6ImYnJsdZFHH4wROnQE1gZ_Ay8xCPtYrQbHNF7g.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5194836340388806633.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-eTBtWiNqnAur5HXQEFVnL7f1Ibp9-eY2-CXCmL3BevY.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-TTI91vAecEJfY9U61lBmz1QbUhTM6qr48aVuxv1GVXY.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-fYagrrG4lQLgXyw-8--syvtUsYqc6rwDR6EiHY3XOJI.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/6.6.1/workerMain/gradle-worker.jar to gs://temp-storage-for-perf-tests/loadtests/staging/gradle-worker-tCB-tXOIAAn4LmoSmikKJyrAIlowEfF_rlDB7aoJhPw.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-YJ2e34gk8jJNMq9if5Vqez3K_t_hfG5srCw1aZ4xGVg.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-SRMJVC4fl9Qn6vmtUEE2h5LPdDj865jLu3suNtxkmpQ.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-DkP60yiVv3poIv913eg5SEuCs4ivN2Os68V_pDhSC7A.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-5u_GO2sSfqfOqaBfvybStxbd6lE1r5F5WqyI1Ru0d9Y.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-Mjgp4zAsaIi-LE4RUJOkdB1S90VFMVNaaQD679ajjU0.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-jdENVZHpHcYYW6OkjqSe2lYSWhcGjmiWcRt0VrkIdsQ.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-ch7Erq9L-lLZw6UokE_yVkLtLRVau6YMlC8xHX8egMc.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-JeRzQvNRAaxb2v8RneT6lnRXMs9XqESLiBTIvVF6uw8.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-qfI42oL9oTRONhYIGXnOm_QxjK2pZHSaZd6qrSOB_m4.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-nlf7GhPi4dlCeWX0-jmTvxN6NPkjNV-6t2HMpApb7B8.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-BjG3zlLPUcz1C2zhooZtP9G4O_Wv3ef0UoaoTdk4FSA.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-ZvOlgK3b3vDKlBIvh0G-madf_D8eGzFTMgPUVQ9Ho2c.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-nSBkjDpRvYcwSuLkAW4i-KMTwEqibnk8Fi7KvQWCWdY.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-hzYmJEYl5eFgFVn-zkHCsUHUOfZf2sdxdZJUcI72tUU.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-Nc193fsHDYFvw-WPzofTo8HId5bCqgDGGp0oOp_oyXk.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-ZYE_-GJsdKWKOAwBCHUte_7EX2BUJF7l2Pyf6v1jhsI.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-U39yDdl7TFgtZdLV3TG-Hl7vQJ3SkYwTCDTZtUiGR40.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-4LcCA56Z1igQfUIOSF-lEEHS8AZPaf9D6hUZS9EKRHg.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-KyUXsJJp9UUQ8_UmMXoexWUn4J4-vJU5t-iz483iUU0.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-Ep_BNhIj0ldSCNkkpQ0XRi8NPheJd3Fb-Zn5-e22Fww.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-RQv5kLzsgXJfaWdhlveQRFz4CkLbLhWwyOGMH38XUGs.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-bv3cTBLTyvbgrGsXWA_vnRuUvG5bOuoZrMJJ7PYMGjY.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-y6ryAAKxeN6oGhfZhaLSe0vqisZzDZtOA-caGfitaNE.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-Fuxo3U3LwWn41VJ8wSA96k-xommzgPcaBJti7YIDzDU.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-lgAbfXsGJU5Oh_Jd9k_vHcM3YLkjKbn6zJTWTNgN8BM.jar
    Oct 07, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT---fBy-bQfP7xWcgTV0_CHDxZcdIBoME-yGfEpV1S584.jar
    Oct 07, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 31 files newly uploaded in 1 seconds
    Oct 07, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 07, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 07, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 07, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 07, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 07, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96276 bytes, hash c38e73180e67f7d78f5297f15e4278f813880a1cb989e786477409a9f0c49cf4> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-w45zGA5n99ePUpfxXkJ4-BOIChy5ieeGR3QJqfDEnPQ.pb
    Oct 07, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 07, 2020 6:45:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-06_23_45_40-12955445802791502328?project=apache-beam-testing
    Oct 07, 2020 6:45:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-06_23_45_40-12955445802791502328
    Oct 07, 2020 6:45:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-06_23_45_40-12955445802791502328
    Oct 07, 2020 6:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-07T06:45:40.875Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 07, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:45:49.795Z: Worker configuration: n1-standard-1 in us-central1-b.
    Oct 07, 2020 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:45:50.407Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 07, 2020 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:45:50.436Z: Expanding GroupByKey operations into optimizable parts.
    Oct 07, 2020 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:45:50.463Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 07, 2020 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:45:50.532Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 07, 2020 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:45:50.568Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 07, 2020 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:45:50.601Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 07, 2020 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:45:50.636Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 07, 2020 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:45:50.983Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 07, 2020 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:45:51.064Z: Starting 5 workers in us-central1-b...
    Oct 07, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:46:12.551Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 07, 2020 6:46:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:46:22.558Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 07, 2020 6:46:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:46:30.816Z: Workers have started successfully.
    Oct 07, 2020 6:46:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:46:30.849Z: Workers have started successfully.
    Oct 07, 2020 6:47:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:47:04.232Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 07, 2020 6:47:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:47:04.388Z: Cleaning up.
    Oct 07, 2020 6:47:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:47:04.471Z: Stopping worker pool...
    Oct 07, 2020 6:47:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:47:57.145Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 07, 2020 6:47:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:47:57.192Z: Worker pool stopped.
    Oct 07, 2020 6:48:03 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-06_23_45_40-12955445802791502328 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 32cf17f7-0d0f-4c02-a0dc-84dfb5952d1a and timestamp: 2020-10-07T06:48:03.384000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     15.49

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 07, 2020 6:48:03 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 38.763 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 45s
107 actionable tasks: 69 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/rsoyvvnpsbwos

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1084

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1084/display/redirect?page=changes>

Changes:

[Luke Cwik] [BEAM-10670] Improve messaging in CHANGES.md giving clearer instructions

[Luke Cwik] fixup! Address PR comments

[ningk] [BEAM-11025] Make PCollection Visualization tests robust

[Ismaël Mejía] [BEAM-10670] Support impulse within non-portable Spark implementations.

[noreply] [BEAM-10937] Add first introductory notebook (#12892)

[Ismaël Mejía] [BEAM-11027] Avoid excessive logging of ZetaSQL on Nexmark run


------------------------------------------
[...truncated 303.45 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 07, 2020 6:27:51 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 07, 2020 6:27:51 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 07, 2020 6:27:51 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 07, 2020 6:27:51 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 07, 2020 6:27:51 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 07, 2020 6:27:51 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 07, 2020 6:27:52 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
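
For reference, the same projection-and-filter query shown in the SQL: block above can also be run through SqlTransform over any schema-aware PCollection of Rows. The sketch below is an assumption-laden illustration (an in-memory `stories` collection standing in for the table; PCOLLECTION is Beam SQL's name for the transform's single input), not how this integration test wires up the BigQuery table provider:

    // Sketch only: `stories` is a hypothetical PCollection<Row> whose schema
    // has `by`, `type`, `title`, and `score` fields and is already set.
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    PCollection<Row> result =
        stories.apply(
            SqlTransform.query(
                "SELECT `by` AS author, `type`, `title`, `score` "
                    + "FROM PCOLLECTION "
                    + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));

Because SqlTransform requires its input to carry a schema, the same missing-coder failure reported above would surface here too if `stories` were built without one.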

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 07, 2020 6:27:52 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 07, 2020 6:27:52 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 07, 2020 6:27:52 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 07, 2020 6:27:52 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 07, 2020 6:27:52 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 07, 2020 6:27:52 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 07, 2020 6:27:53 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 07, 2020 6:27:53 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 07, 2020 6:27:57 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 07, 2020 6:28:01 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 07, 2020 6:28:01 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-WMqT_PiKoQLbdJR6MMVOpCAn7I3iH6CY7n1WCxEnIrQ.jar
    Oct 07, 2020 6:28:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-UzfCLXjod6X1WKKXI8dxDMCCbRH0AEtccus0PNrqJOc.jar
    Oct 07, 2020 6:28:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-3cmMKgG6_z_RPSlR1TfeeCWo9LTU9KAiFp8y9G6MMdw.jar
    Oct 07, 2020 6:28:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-PFFy0uCuGei3At61G6nIQvar31jR2JjiOZPafcRB8Pc.jar
    Oct 07, 2020 6:28:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-nbFnNBOs23FQp4fL1491OPEee-RX9cI1C9vg_6x-KOQ.jar
    Oct 07, 2020 6:28:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-5KsWz28J2JMOYq2yr7CyviWkWKv2ZTAjcKpX3VOIczE.jar
    Oct 07, 2020 6:28:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-7A37sT1iL6EbTmy0a1K3PFvl6ng3tF84bTCUAGaJeJo.jar
    Oct 07, 2020 6:28:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-GX15yH6ie97j_WAgM2Gb69ZsyH-9UKzj8eji6MfuotM.jar
    Oct 07, 2020 6:28:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT--o1wKu7rGSwzRn8w5sfAs9MUA_9LoF8OTOJw4a_vLho.jar
    Oct 07, 2020 6:28:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-Kq4n_XvEJ6UXpdHetx2CBhVHumNuZtGwL0sSlTFBiHw.jar
    Oct 07, 2020 6:28:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5150289568725898321.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-frcqWkzngVcM3LI_dvWDTmFG8V29JGqY18lxZUDcKiE.jar
    Oct 07, 2020 6:28:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-DDpbhAQDGjIR0h70f8AyEcateVYgYaTc5IGZZ07rEQM.jar
    Oct 07, 2020 6:28:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-WMqT_PiKoQLbdJR6MMVOpCAn7I3iH6CY7n1WCxEnIrQ.jar
    Oct 07, 2020 6:28:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-65KiROC5zZQq-R5RyopkUEgZO2mlzyi4R6YTPf0trTI.jar
    Oct 07, 2020 6:28:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-GZArL832QIGO15X5PCXsNENhB9Dk0KtuVcpnTUJloyw.jar
    Oct 07, 2020 6:28:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-GFsg48NCy5a6i0qjp-D6qEX1Rm5KUqTLT6vFfmLXLIk.jar
    Oct 07, 2020 6:28:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-aPXCvXcL4VSpxaNslS3Fr_LoOhSbDeFAPTRw2RXWJ3M.jar
    Oct 07, 2020 6:28:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-fTBGRCbte4QDaqamyGfeWDzICAhKnVxy0ieqYuO0EPA.jar
    Oct 07, 2020 6:28:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-IQW7QWlqS3-JlRcCy4n2LEpfpON2tBQjIKhqaOqCWe0.jar
    Oct 07, 2020 6:28:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-faNnE9ojsypBa8qu2slQmxeMs0lmJ3ZuifdzalwvV8M.jar
    Oct 07, 2020 6:28:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-xHcbeNmdu9MWXWFdZ-TsVl3kn9T2_iAOvNULbkFf_Zw.jar
    Oct 07, 2020 6:28:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-xrRlFc6ZPE2RVjaRRWkk_iZvreZDCKDAWP75uhMSwmE.jar
    Oct 07, 2020 6:28:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-uvpctmcBZJljQRcBMhA5vJlvkMnu2B0SgrIiuScxhjY.jar
    Oct 07, 2020 6:28:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tIVoi9k0d5yvUM_Ij7CDp2oKxv7QEDBUu32iq9YzsJw.jar
    Oct 07, 2020 6:28:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-93TbVhDQ-y7G3HI5XQUTtAoxEiASNLrShoSGqo2mqnw.jar
    Oct 07, 2020 6:28:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-7vGxVFFdFrNmunOgYvlQ5DLpG1XXNzXJ4U8q-WFLXpI.jar
    Oct 07, 2020 6:28:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-G1v0Vz9dc5nkBe2sTiEyc7f_NAr2doQM2X-UKtiUAlQ.jar
    Oct 07, 2020 6:28:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-Jba2IbC9lPYHOlhjk8hlSF0T_YWeFBP92NngZmUZj0c.jar
    Oct 07, 2020 6:28:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-JXXKm94FvO0OVcyfga-IpO83JHiW9ZhIZ32bGisHCYc.jar
    Oct 07, 2020 6:28:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-EISKmEdFOUpwEZrd0O_fbRo_TuI8bWb3Y8EgEyl3fl8.jar
    Oct 07, 2020 6:28:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-KfkMOt6CMRpiQvBLxULrF1A7JEqK5fLpNJnrXTLcKJs.jar
    Oct 07, 2020 6:28:07 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 5 seconds
    Oct 07, 2020 6:28:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 07, 2020 6:28:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 07, 2020 6:28:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 07, 2020 6:28:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 07, 2020 6:28:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 07, 2020 6:28:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96275 bytes, hash 7f84682b410921b4d11831aafa585d32b45b042521a8959fc900ccb8a68a9bba> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-f4RoK0EJIbTRGDGq-lhdMrRbBCUhqJWfyQDMuKaKm7o.pb
    Oct 07, 2020 6:28:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 07, 2020 6:28:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-06_23_28_08-7952897450768427380?project=apache-beam-testing
    Oct 07, 2020 6:28:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-06_23_28_08-7952897450768427380
    Oct 07, 2020 6:28:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-06_23_28_08-7952897450768427380
    Oct 07, 2020 6:28:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-07T06:28:08.913Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 07, 2020 6:28:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:28:16.979Z: Worker configuration: n1-standard-1 in us-central1-b.
    Oct 07, 2020 6:28:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:28:17.791Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 07, 2020 6:28:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:28:17.834Z: Expanding GroupByKey operations into optimizable parts.
    Oct 07, 2020 6:28:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:28:17.857Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 07, 2020 6:28:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:28:17.953Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 07, 2020 6:28:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:28:17.989Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 07, 2020 6:28:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:28:18.017Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 07, 2020 6:28:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:28:18.042Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 07, 2020 6:28:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:28:18.442Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 07, 2020 6:28:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:28:18.525Z: Starting 5 workers in us-central1-b...
    Oct 07, 2020 6:28:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:28:29.178Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 07, 2020 6:28:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:28:39.374Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 07, 2020 6:28:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:28:57.989Z: Workers have started successfully.
    Oct 07, 2020 6:28:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:28:58.026Z: Workers have started successfully.
    Oct 07, 2020 6:29:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:29:29.815Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 07, 2020 6:29:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:29:29.979Z: Cleaning up.
    Oct 07, 2020 6:29:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:29:30.101Z: Stopping worker pool...
    Oct 07, 2020 6:30:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:30:20.878Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 07, 2020 6:30:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-07T06:30:20.939Z: Worker pool stopped.
    Oct 07, 2020 6:30:25 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-06_23_28_08-7952897450768427380 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): e141a2f0-67f2-47a3-9ed8-838cc2941255 and timestamp: 2020-10-07T06:30:25.988000000Z:
                     Metric:                    Value:
                   read_time                    13.131
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 07, 2020 6:30:26 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 3 mins 13.214 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 48s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/2iyz3piuetzbe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1083

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1083/display/redirect?page=changes>

Changes:

[noreply] Update Dataflow Python container to beam-master-20201005 (#13011)


------------------------------------------
[...truncated 271.11 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 06, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 06, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 06, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 06, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 06, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 06, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 06, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
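
The IllegalStateException above is Beam's standard coder-inference failure for a PCollection of Row: a hand-written ParDo that emits Row gives the SDK nothing to infer a coder from, so the schema has to be attached explicitly, as the message itself suggests via PCollection.setRowSchema. A minimal Java sketch of that remedy, assuming a pass-through DoFn standing in for the test's RowMonitor and field names/types guessed from the query above rather than taken from the real table definition:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Field names and types are assumptions based on the SELECT above, not the real table schema.
    Schema schema =
        Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

    PCollection<Row> monitored =
        rows // rows: an upstream PCollection<Row>, assumed to exist in scope
            .apply(
                "RowMonitor",
                ParDo.of(
                    new DoFn<Row, Row>() { // stand-in for the test's RowMonitor transform
                      @ProcessElement
                      public void process(@Element Row row, OutputReceiver<Row> out) {
                        out.output(row); // pass-through; a real monitor would also record metrics
                      }
                    }))
            .setRowSchema(schema); // or, equivalently, .setCoder(RowCoder.of(schema))

Either setRowSchema(schema) or setCoder(RowCoder.of(schema)) would satisfy the check; which change the test actually needed is not visible in this log.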

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 06, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 06, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 06, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 06, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 06, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 06, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 06, 2020 6:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 06, 2020 6:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
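
The BeamPushDownIOSourceRel above, with usedFields and a supported BigQueryFilter, is what the planner produces when the table is read through the BigQuery Storage API (method DIRECT_READ), so both the projection and the filter are pushed into the source instead of remaining in a LogicalProject/LogicalFilter as in the DEFAULT-method runs elsewhere in this log. A minimal sketch of how such a table can be registered through Beam SQL DDL; the column list and LOCATION are illustrative rather than the real perf-test table, and sqlEnv is assumed to be an existing BeamSqlEnv with the BigQuery table provider registered:

    // Sketch only: names below are assumptions for illustration.
    String ddl =
        "CREATE EXTERNAL TABLE HACKER_NEWS (\n"
            + "  title VARCHAR, score BIGINT, `by` VARCHAR, `type` VARCHAR\n"
            + ") TYPE bigquery\n"
            + "LOCATION 'apache-beam-testing:beam_performance.hacker_news'\n"
            + "TBLPROPERTIES '{\"method\": \"DIRECT_READ\"}'";
    sqlEnv.executeDdl(ddl); // registers the table so the query above can be planned with push-down

With the method left at DEFAULT, as in the readUsingDefaultMethod runs, no push-down node is produced and the plan keeps the full filter and projection on top of BeamIOSourceRel.
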
    Oct 06, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-Kc2bLNvNH8_LVSeo81fFmVazbGwvnCiMah5fvTtSpzg.jar
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-qly_xV49YX4sHLYNAy4DT0ubOYDAP0MdlyhCmpUbYtg.jar
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-wdBxfgsG5sWe27g91T9pjLzIKI-non35xK8S5wYZMhs.jar
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-EHuRLeW3G0SFl1sNa5h9dQLus4jgo76h1BonJ6yj7Rg.jar
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-H4PMIc9w3fDU3ICJ3v0LqDHS2ilj4ZNAAQZrHJFJiNQ.jar
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-ggNejFW-EyhYuTwsJ6kT6pK9rO8kAahYzCDD1MV8ZSs.jar
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-jSXfnUJNUFICDys4tvf-MYbvlD28UA7ohTFyS52hY5A.jar
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-rnMcaGvDKjhInBOhGUkRQcd7aHzH7PZhcadFUQ4aqyg.jar
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-J9gPZZv5ntFEOYp2wOdUUcHpKPIirrYRNUx50UZTD1w.jar
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-xggJ4wIR3Ru_-YyjLWrBzMQJEH4rYIsfS9GkQDEvAso.jar
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-bs1MRpLkYIreKHSTRgFDOK9HN7enJM48wM28WSIuBJc.jar
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-KW1IqK1yijnCrVA3wALSRvhvFTrNu77XqIZBP2ozTig.jar
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-CeydnqGWdPUXV3tKlD4nuTV-VLVhqt4YzAX60QlrEJc.jar
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-DctomvhgEc61FMK6kY4sZQB-VqK90AK_EO0gUUIpEfE.jar
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-duxiQ4U72bGn1sbC-hx28_y4bxGp1l__zlgdqzUsWDo.jar
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-agcXKuz2jmR87wuXmpim8woW0mQQjz7glP1x_6lF8_M.jar
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-FEfEw6Aa-VSIoJS1YJ6TD_yEodzxvFU5UgYko_MT0ms.jar
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-1cjtVQy-OVW0Ls5HgmSViW6rp5UbYFCDMZrKj9dZm0A.jar
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-dVwPcFXimx8wmI-ZNC1lLa3l5inPOp2r0YJJABkOFNc.jar
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-KLWq7dHAD_mMRjH22-bRIYAUSVag_bwm_8ny3SUP_FA.jar
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-v7DQ2sMjVhsNL41NQBOIa-BUVrOl74lJbvaGuKTzzAw.jar
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4308786702464570260.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-cAb0ZNeMRceN6gDlbTTyrH6eJlx0OLAaSL9zSmdrHpc.jar
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT--NamuN67RPBwI-QH0HCcqiMcYrBwCT3kr0-H2iz2EFQ.jar
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-5MM1da0TsWOh7o0FXyBrNd06qAvU59ja0oHQSz_pnVc.jar
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-Mqcyu75iYxuH8CTlErrwHF6XuB7Wj_0c-0qezSeDca4.jar
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-ccma4YfGAD8bTIayP-c9cuE8kVOCV5GTbNLN6DSG4X8.jar
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-Kc2bLNvNH8_LVSeo81fFmVazbGwvnCiMah5fvTtSpzg.jar
    Oct 06, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-mFQBPtuq0KHn7qoU0HDUL24bCdtYBKnUzU9Hr0HQyQY.jar
    Oct 06, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-gFycG8reczUSyaBuBvlO8fSYfkcu76_A4eS-OyU39hc.jar
    Oct 06, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-vOmCluHucdUzg6UHs-hZrhO2nGlZQDDfqrWncsvsIek.jar
    Oct 06, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-De7XhpW72rMKRLJUHzKLmMTukE8zQDnGrRzlYZFK1qg.jar
    Oct 06, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 1 seconds
    Oct 06, 2020 6:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 06, 2020 6:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 06, 2020 6:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 06, 2020 6:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 06, 2020 6:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 06, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96275 bytes, hash 8530069bb9f8321a22d8f62a039195166732c0dc6b8c8d8ca1ac941c86556198> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-hTAGm7n4Mhoi2PYqA5GVFmcywNxrjI2MoayUHIZVYZg.pb
    Oct 06, 2020 6:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 06, 2020 6:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-06_11_45_29-10178933180447607564?project=apache-beam-testing
    Oct 06, 2020 6:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-06_11_45_29-10178933180447607564
    Oct 06, 2020 6:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-06_11_45_29-10178933180447607564
    Oct 06, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-06T18:45:29.794Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 06, 2020 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T18:45:38.337Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 06, 2020 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T18:45:39.079Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 06, 2020 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T18:45:39.132Z: Expanding GroupByKey operations into optimizable parts.
    Oct 06, 2020 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T18:45:39.167Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 06, 2020 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T18:45:39.244Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 06, 2020 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T18:45:39.271Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 06, 2020 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T18:45:39.310Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 06, 2020 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T18:45:39.351Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 06, 2020 6:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T18:45:39.890Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 06, 2020 6:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T18:45:39.970Z: Starting 5 workers in us-central1-c...
    Oct 06, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T18:45:57.239Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 06, 2020 6:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T18:46:07.581Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 06, 2020 6:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T18:46:26.118Z: Workers have started successfully.
    Oct 06, 2020 6:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T18:46:26.214Z: Workers have started successfully.
    Oct 06, 2020 6:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T18:47:06.914Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 06, 2020 6:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T18:47:07.091Z: Cleaning up.
    Oct 06, 2020 6:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T18:47:07.180Z: Stopping worker pool...
    Oct 06, 2020 6:48:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T18:48:04.901Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 06, 2020 6:48:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T18:48:04.951Z: Worker pool stopped.
    Oct 06, 2020 6:48:10 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-06_11_45_29-10178933180447607564 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 3a1eea4a-9c22-4b09-9c4e-a94819116441 and timestamp: 2020-10-06T18:48:10.879000000Z:
                     Metric:                    Value:
                   read_time                    20.722
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 06, 2020 6:48:11 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.035 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 57.383 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 53s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/kdr3ognzbp73w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1082

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1082/display/redirect>

Changes:


------------------------------------------
[...truncated 270.97 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 06, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 06, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 06, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 06, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 06, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 06, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 06, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 06, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 06, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 06, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 06, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 06, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 06, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 06, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 06, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 06, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-HeQBFqAYfEHPWj7Nqfe7JxdQmMMRQXhYiGHbFiw71g0.jar
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-S_kZAB7RQGvCglGwaNQeYo6mj0wyqY31HEWoN1hqtuY.jar
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-mh26PcE28Y8RaGEqnyPXzev7TyRVA9YSF_UirGmNzS0.jar
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-54CP28SaLpLPMElNHIf0Iqpj0ymTolpx2f2ZTSeFqu0.jar
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-PqF5b0Zrm5G-PpXSGVoCbW91L2Inz3-mPG2IHeKAu_U.jar
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-yyXLBI15CxcESwTGeWkpDLYSqdV23FWypqNEFdZd7Gk.jar
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-JY9lJQn1lrD1eMSsW2lTkwwwgmv6w0tc7qkwospwQNI.jar
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-54CP28SaLpLPMElNHIf0Iqpj0ymTolpx2f2ZTSeFqu0.jar
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-PB81RAxlEhq44KqS5ifvdq1YEonQtUhiywh-OTUaD5c.jar
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-YScpxGADTWTt-Vl2uL9Eg526p6n5YiTEnUZTU7xaVz8.jar
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-PEYLoZlHgq8_1yvRiMM4GWzty8_yw4yGuX2bUQi6_3g.jar
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-yuMH3OaVGb784_xqlDijS1akaJKpM7OdTD_dYynNobo.jar
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-lftZi7HHKHaTCVQBObjFtxBO9GwCF4zjz7RqhZO64NY.jar
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-T9CC3_54lk_9onxAqlIaL0VqN5FiuB40uJDOb2PO03I.jar
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test231524584149644177.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Au9ebvGachvqBgW_FuVvKYN3jqO0Zrs_neFUcXDwKMg.jar
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-XbobphTrP5ierNq7HdnzFlMEfyeQYLiH_Z9m2BMvZOc.jar
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-3eGQ2tfNIzQlIpxZUNmvST7KIYje4DybSvnnHulL6AA.jar
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-IWPZpNYcPL-dTpXMvAweXH2i5MS6klzPqAmXyyVUPbc.jar
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-OFlsQUEyCjLAzqSIQbqjpELgoq9u-n2nBRO3THNYSJQ.jar
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-VZ2yU5tFq0ZdFe5UTMFS_XvigiWPk_9BEe-0Wlmeeig.jar
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-dC6NHT-Viizb7AFApqTPKsvCVWT3LZ7ApVDG3uog9SI.jar
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-FL5P7c2CeTzijwSSHExv9hc3f9QfM7seTxzt5hnMYDY.jar
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-lGDtwitB1pmOA5imoIb4JKoKx2Cf_SQD_bsqjlXsHV4.jar
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-h_okV3RpgVqRVErjWVsWLXg8uLXNM8lZ8kjkXGYOoaA.jar
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-u8mH93dKRwbMmU9qpxcVYF-KxbJcqJPE9FE6SOpVohM.jar
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-m0Ipbd6EX2o9PMJY7SNKt8jUD_uXyEi823jZgyMajPU.jar
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-FHizIFJEwClBYvx2APhlb_2UZ7d3ltl39QCAD7NVaqk.jar
    Oct 06, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-Aa62890MfpOP3Hj0FgabOjBhUJHmAHQBqxRi3_n81Go.jar
    Oct 06, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-I9QRp_9AyoFWELm-ZL91k9rjiSYzO41jjteWy4P9GYw.jar
    Oct 06, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-vJbwm1RnslaXwBVVUkAi7I2PWgHx59RF1yrJV7SP4G8.jar
    Oct 06, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-t_QtH-YinMJRZjK5RyFBFfRINWFSBlp6mt4MNhxh6wU.jar
    Oct 06, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 1 seconds
    Oct 06, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 06, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 06, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 06, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 06, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 06, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96274 bytes, hash 2a0908ba2ba6a86dda3874e1b0b82b1e80832c0bae9d2415dbab69777fa585fc> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-KgkIuiumqG3aOHThsLgrHoCDLAuunSQV26tpd3-lhfw.pb
    Oct 06, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 06, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-06_05_45_24-8697567479420114513?project=apache-beam-testing
    Oct 06, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-06_05_45_24-8697567479420114513
    Oct 06, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-06_05_45_24-8697567479420114513
    Oct 06, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-06T12:45:24.282Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 06, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T12:45:32.089Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 06, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T12:45:32.868Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 06, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T12:45:32.907Z: Expanding GroupByKey operations into optimizable parts.
    Oct 06, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T12:45:32.946Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 06, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T12:45:33.012Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 06, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T12:45:33.053Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 06, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T12:45:33.089Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 06, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T12:45:33.122Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 06, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T12:45:33.551Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 06, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T12:45:33.633Z: Starting 5 workers in us-central1-c...
    Oct 06, 2020 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T12:45:59.437Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 06, 2020 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T12:45:59.565Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 06, 2020 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T12:46:22.707Z: Workers have started successfully.
    Oct 06, 2020 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T12:46:22.737Z: Workers have started successfully.
    Oct 06, 2020 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T12:46:54.518Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 06, 2020 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T12:46:54.679Z: Cleaning up.
    Oct 06, 2020 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T12:46:54.764Z: Stopping worker pool...
    Oct 06, 2020 12:47:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T12:47:45.259Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 06, 2020 12:47:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T12:47:45.300Z: Worker pool stopped.
    Oct 06, 2020 12:47:50 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-06_05_45_24-8697567479420114513 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 44cdd7da-ab54-4f73-b6c1-b0e0d13a3c20 and timestamp: 2020-10-06T12:47:50.516000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.025

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 06, 2020 12:47:51 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.035 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 40.282 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 30s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/vuu5aujh6kn42

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1081

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1081/display/redirect?page=changes>

Changes:

[ningk] v1.0.0


------------------------------------------
[...truncated 272.59 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 06, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 06, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 06, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 06, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 06, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 06, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 06, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
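
    A note on the failure above: the exception message itself names the remedies -- either give the
    PCollection an explicit coder or attach a row schema. The sketch below is only an illustration of
    the second option; it is not taken from this build, and the pipeline, schema fields and
    pass-through DoFn are made-up stand-ins for the test's RowMonitor output:

        import org.apache.beam.sdk.Pipeline;
        import org.apache.beam.sdk.coders.RowCoder;
        import org.apache.beam.sdk.schemas.Schema;
        import org.apache.beam.sdk.transforms.Create;
        import org.apache.beam.sdk.transforms.DoFn;
        import org.apache.beam.sdk.transforms.ParDo;
        import org.apache.beam.sdk.values.PCollection;
        import org.apache.beam.sdk.values.Row;

        public class RowSchemaExample {
          public static void main(String[] args) {
            Pipeline pipeline = Pipeline.create();

            // Schema for the Row elements flowing through the pipeline; the field names
            // here are illustrative, not the actual HACKER_NEWS table schema.
            Schema schema =
                Schema.builder().addStringField("author").addInt64Field("score").build();

            Row row = Row.withSchema(schema).addValues("someone", 3L).build();

            PCollection<Row> rows =
                pipeline
                    .apply(Create.of(row).withCoder(RowCoder.of(schema)))
                    .apply("PassThrough", ParDo.of(new DoFn<Row, Row>() {
                      @ProcessElement
                      public void processElement(@Element Row r, OutputReceiver<Row> out) {
                        out.output(r);
                      }
                    }))
                    // Without this call the pipeline fails at construction time with the
                    // IllegalStateException above, because no coder can be inferred for Row.
                    .setRowSchema(schema);

            pipeline.run().waitUntilFinish();
          }
        }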

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 06, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 06, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 06, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 06, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 06, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 06, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 06, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 06, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 06, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-ctTwVKvLIeknfp6JvdGfhBUoot0IBDg5n_JGuOdK0hk.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-a9mjtJxt1vuM4Mb6nOMVEipYsnUx3wTy2ooBlZzLn38.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-kMFpoc9ePXjMbvjTEvEpSN3zRBWDwhwKWIJZBsA7Q6E.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2400843021617916791.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-_-nBJBDg2F-i1O0XJCmzY1wyevcOINyjo2LleNH6EXE.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-_yXdypqibpXSdEU8LPQedseYDEvObf03_L0IedtiH8c.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-wn2QZffzize7F6mJ0Zqih2YqxeQysXMoAQdqyH9wjT4.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-9KBa8ZEW2N17CM6ihkMHyUStcGygwJvoRy1TpgbNcw4.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-BtTiO2YDCRyI1r0gH-u-IIb4nPaA3b9PVQ3n8J1f7So.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-w0bjKMMt6kFp595vEggoRtrj7f5KvJ5YAe0chiGbZ3w.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-Q1QYM7drueqzgLRa3A85OwPxHKACbJQE2pEWZkOlBeY.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-TpXMYtbAKGSUOvDATp21M0VM6M2g84jlBYy_hbBxY0U.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-CvcMDL97CtGKbVMVllzEu6yqKJzQf9GA8m5zR36raS8.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-ctTwVKvLIeknfp6JvdGfhBUoot0IBDg5n_JGuOdK0hk.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-YBEjCTw3f1OaFeV9rSN--TiFC06gz8jYIhLMdSoLXRk.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-z9Ag6U7R6pvpftSZ_SrDaJQOe_mbyTbevSnPWSAXgfU.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-7SLUXu41n7VEwx9AwDlDWUWAtmR6SYR0rDFEBpcZurI.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-uBl7Ojt3aATCboHbsv8W5A9nHe9hrJfm-FwbAtMCiBU.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-340iiU2R3CBhZVZomORz0CjTq78uJu2oOcspiE652gQ.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-RenlYU0bPQZv1FQFHTmCYCv06knnXU2ECLUZ0wXOjYg.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-gpMuQemQFgkhgcZLSrL0aBljhRelhAn06xtdbBroUbs.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-Aq04NglPqj0B9xZFhlaQNyE4Rjh_J7DVwip4xm45WrA.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-pqVQbKcdRsjGO0xGGAg1u3FtLlE2GY0IdpajVctJGhU.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-99Y4NeNf2tRBhiE746Ko_DmSG9gTUXAwaPimrx7MhBE.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-Jb8uCy_op4JJAXz_4c0ZhDPXJh7sX6SMfrE3UyPOn2A.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-1StWrk2DErQdQNACP1yOFurqxsUz60-l6wcl5f291-g.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tUWXJqKJRVmHniyg0flroHRvJ_FP5YDknIScSCl9ONQ.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-Nz9zFRqOjv7t_AXaNkNP_YlqGgKNNsQLZPXMQBcmVqQ.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-rACI_p8S3xbe9JeZtbHHZPDnr9Frm7zwMmJFftin1Sk.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-gFQnikc4WJMOePZ5ejmUuyM6wOmWkILgeCivxaWTo8Y.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-nxfrH8DYXkzwv9EQex8ntVnvb5UZX_WrXlUofJCoiJQ.jar
    Oct 06, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-iihoQTMBj1HiXH1kOLNywosXVRNOYOZmCJpIM9jmmj8.jar
    Oct 06, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 0 seconds
    Oct 06, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 06, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 06, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 06, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 06, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 06, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96275 bytes, hash 681107a88bc013982ef0dc4e9d45cc55c6b8cc07d94d1e8ef19775da73a92390> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-aBEHqIvAE5gu8NxOnUXMVca4zAfZTR6O8Zd12nOpI5A.pb
    Oct 06, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 06, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-05_23_45_17-2918160370820696568?project=apache-beam-testing
    Oct 06, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-05_23_45_17-2918160370820696568
    Oct 06, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-05_23_45_17-2918160370820696568
    Oct 06, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-06T06:45:17.567Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 06, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T06:45:27.251Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 06, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T06:45:27.919Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 06, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T06:45:27.962Z: Expanding GroupByKey operations into optimizable parts.
    Oct 06, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T06:45:27.999Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 06, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T06:45:28.067Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 06, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T06:45:28.098Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 06, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T06:45:28.127Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 06, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T06:45:28.201Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 06, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T06:45:28.489Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 06, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T06:45:28.552Z: Starting 5 workers in us-central1-f...
    Oct 06, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T06:45:54.218Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Oct 06, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T06:45:54.239Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Oct 06, 2020 6:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T06:45:55.566Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 06, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T06:46:12.396Z: Workers have started successfully.
    Oct 06, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T06:46:12.420Z: Workers have started successfully.
    Oct 06, 2020 6:46:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T06:46:40.831Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 06, 2020 6:46:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T06:46:46.613Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 06, 2020 6:46:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T06:46:46.765Z: Cleaning up.
    Oct 06, 2020 6:46:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T06:46:46.837Z: Stopping worker pool...
    Oct 06, 2020 6:47:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T06:47:39.716Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 06, 2020 6:47:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T06:47:39.760Z: Worker pool stopped.
    Oct 06, 2020 6:47:44 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-05_23_45_17-2918160370820696568 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 8a9cfaad-cb1f-40ea-b4c4-69507db09a68 and timestamp: 2020-10-06T06:47:44.747000000Z:
                     Metric:                    Value:
                   read_time                    14.246
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 06, 2020 6:47:45 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 41.331 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 28s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/fjyyyptwspkns

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1080

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1080/display/redirect?page=changes>

Changes:

[Luke Cwik] [BEAM-10670] Make Read use SDF by default. Override in runners.

[Luke Cwik] fixup! Fix unit test failures that were missed.

[Luke Cwik] [BEAM-10670] Don't start/finish bundles when there are no timers that

[Chad Dombrova] tox: allow apps that support color to use it.

[Chad Dombrova] [BEAM-7746] Fix typing in beam.utils

[Chad Dombrova] [BEAM-7746] Get mypy passing on runners.worker

[Luke Cwik] fixup! Fix spotbugs/checkstyle warning

[noreply] [BEAM-9682] Move existing Go SDK katas fixed time window lesson into


------------------------------------------
[...truncated 282.16 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 06, 2020 12:45:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 06, 2020 12:45:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 06, 2020 12:45:54 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 06, 2020 12:45:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 06, 2020 12:45:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 06, 2020 12:45:54 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 06, 2020 12:45:54 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
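
    The same missing-coder failure shows up again in this build; the other remedy listed in the
    message is to set a coder explicitly. Again purely as an illustration under an assumed schema
    (not the table's real one), that variant looks roughly like:

        import org.apache.beam.sdk.coders.RowCoder;
        import org.apache.beam.sdk.schemas.Schema;
        import org.apache.beam.sdk.values.PCollection;
        import org.apache.beam.sdk.values.Row;

        public class RowCoderExample {
          // Attach an explicit coder to a PCollection<Row>; equivalent in effect to
          // calling PCollection.setRowSchema(schema) for Row elements.
          public static PCollection<Row> withExplicitCoder(PCollection<Row> rows, Schema schema) {
            return rows.setCoder(RowCoder.of(schema));
          }
        }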

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 06, 2020 12:45:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 06, 2020 12:45:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 06, 2020 12:45:54 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 06, 2020 12:45:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 06, 2020 12:45:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 06, 2020 12:45:54 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 06, 2020 12:45:55 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 06, 2020 12:45:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 06, 2020 12:45:58 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-zScDfdqJoAZlSXNl9bIow3bT46NpKuPfp2BTC9wRXhA.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-dE0Cg4_G7UngOQ70jLC51Zn5dokn36Q9D_yHQWV4Tvc.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-Izhze20TkSbC90_imcT9lXMJjAM-M-JKPx27YfkxUs4.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-kWT_B9VLoZ_gbkrkbKcZUbkxQNt75-HBiGStnMV54r8.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-1YKYjCGXsxGi7AXYdxLyQh5PxXIbfi-uS2RRehmF1rQ.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-m-aVB8Dvoq3koZcwGlSFv5GUBhFUk5XgNln_d-Q2W8g.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-76j69XFcKRdvhufao4qxVDJ7-mvYFcp06CwGuO-ss5M.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5319670257160754304.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-8PRlcBp7wV_uL29KAKf20CmoPZiYtxZItCv592AKM9o.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-Qh76OvNwgNsYGPQ26ER-fURlP8r2_zJqml99jD2wC5o.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-NE6HSeCpTQXB3qPApJDQDKec-LAC-441qDhvhy1kBlg.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-Tl4pGnvotmXhi-fDnooJb0L45I3uCC1SqqxyWJqiShU.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT--qSUXvpWBxESPXCnFlaC51CKq130wRkCcVy6X8C93IM.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-XrWe1wqmYbmAKipiA9VnTExYabZ7e5SBgfkGf5GO2cc.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-JrMoft5vs5euR0SUrydri1IE_b4U_WJUulLsYYYGHp4.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-zScDfdqJoAZlSXNl9bIow3bT46NpKuPfp2BTC9wRXhA.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT--XR897rjPbnmx4FAAhw0XvBQAEIFa7HMetBYnoTELgY.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-wrXQlet1lMP1f-JatR8valk6A6fyjyvL9TEqWRtO7mw.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-dgE5Lq4XIeQ4V9Nt-MedjDCp7IqtUMTzdRnusfkHxG4.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-KnU_4UQkey7WSB1ntoVB6joAVHwAt1mzYPVBcxwpODg.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-R5OFJkeWwwTTLxZyJCmlMagjkdBTe5IH2SHEObUYors.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-dX-_JUjFFQokhcquLrPezYOA1XgVGMSow_uxFGdf_TY.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-jxZ7n-21TK1vCNGpz6UDaRgjVLr_qKQLzK8XtYvGqlc.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-p_ceVyGa_07mHXkAmR-c9DR5Yg2AVlY-I0eDHrHNE4I.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-Q3NvKREh9bjTTqNLAIgyUwbfTUIdsSw55njzY5RswxY.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-UraYqapTV9Zukm-nO1ui9doqqQMoRFwW7Brq_SkNbvA.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-EiXoWuvjMWom7idmiIyxpsLrygNdQwxQBd7jyzBU-OQ.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-mwuuou8mPDsdr-08K-Hc5GfmsuMfyfbPxAfhr9au2ko.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-ztsMELKOdUCBG6wiFpgIieQq3NSV_Gge5iOX3eB-Ric.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-vPqWTtLkmWJKCa4uWPZHQa52opcBqPgElZq2iE32DjM.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-RX3u4DLI54nmwx620wxgWbnaGuxqoWQHI7UVTzBSH_w.jar
    Oct 06, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-74Rd8IW1j1wme7nxA9aY1QjVtbH9fRMdme_owOb0ctM.jar
    Oct 06, 2020 12:46:02 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 1 seconds
    Oct 06, 2020 12:46:02 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 06, 2020 12:46:02 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 06, 2020 12:46:02 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 06, 2020 12:46:02 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 06, 2020 12:46:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 06, 2020 12:46:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <96274 bytes, hash e05b7dd3137cf2b5541c8c408614527895198ebb466cfa5d8788897fee69e45d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-4Ft90xN88rVUHIxAhhRSeJUZjrtGbPpdh4iJf-5p5F0.pb
    Oct 06, 2020 12:46:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 06, 2020 12:46:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-05_17_46_02-8899907849741019558?project=apache-beam-testing
    Oct 06, 2020 12:46:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-05_17_46_02-8899907849741019558
    Oct 06, 2020 12:46:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-05_17_46_02-8899907849741019558
    Oct 06, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-06T00:46:02.999Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 06, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T00:46:11.083Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 06, 2020 12:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T00:46:12.919Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 06, 2020 12:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T00:46:12.961Z: Expanding GroupByKey operations into optimizable parts.
    Oct 06, 2020 12:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T00:46:12.991Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 06, 2020 12:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T00:46:13.068Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 06, 2020 12:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T00:46:13.105Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 06, 2020 12:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T00:46:13.149Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 06, 2020 12:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T00:46:13.184Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 06, 2020 12:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T00:46:13.551Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 06, 2020 12:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T00:46:13.625Z: Starting 5 workers in us-central1-c...
    Oct 06, 2020 12:46:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T00:46:37.331Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 06, 2020 12:46:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T00:46:39.058Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Oct 06, 2020 12:46:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T00:46:39.085Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Oct 06, 2020 12:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T00:46:44.401Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 06, 2020 12:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T00:46:58.503Z: Workers have started successfully.
    Oct 06, 2020 12:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T00:46:58.538Z: Workers have started successfully.
    Oct 06, 2020 12:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T00:47:34.453Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 06, 2020 12:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T00:47:34.595Z: Cleaning up.
    Oct 06, 2020 12:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T00:47:34.698Z: Stopping worker pool...
    Oct 06, 2020 12:48:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T00:48:28.730Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 06, 2020 12:48:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-06T00:48:28.774Z: Worker pool stopped.
    Oct 06, 2020 12:48:33 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-05_17_46_02-8899907849741019558 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): e4b2d3ae-f22c-470c-960d-ff5f580560df and timestamp: 2020-10-06T00:48:34.008000000Z:
                     Metric:                    Value:
                   read_time                    15.156
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 06, 2020 12:48:34 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 4 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 55.184 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 18s
107 actionable tasks: 74 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/xjzto46fcoeuy

Stopped 3 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1079

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1079/display/redirect?page=changes>

Changes:

[rarokni] [BEAM-10124] Add ContextualTextIO as a copy of TextIO

[rarokni] [BEAM-10124] Change access modifiers.

[rarokni] [BEAM-10124] Add ContextualTextIO Implementation

[rarokni] [BEAM-10124] Refactor Code and Add option for recordOffset

[rarokni] [BEAM-10124] Refactor requested changes

[rarokni] [BEAM-10124] Refactor requested changes

[rarokni] [BEAM-10124] Change from RecordWithMetadata To Row

[rarokni] Fixup

[noreply] Update sdks/java/io/contextual-text-io/build.gradle

[rarokni] Update

[Robert Bradshaw] [BEAM-9730] Ignore bundle registration requests.

[Robert Bradshaw] Bound the size of the process bundle descriptor cache.

[rarokni] [BEAM-10124] Change default for RecordNum to be false.

[noreply] [BEAM-7463] Fix BQ IT flake with streaming inserts (#12951)

[noreply] Minor fixes to the get-started/wordcount-example webpage. (#12913)

[noreply] [BEAM-10848] Initialize gauge to 0 (#12992)


------------------------------------------
[...truncated 280.82 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 05, 2020 6:45:54 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 05, 2020 6:45:54 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 05, 2020 6:45:55 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 05, 2020 6:45:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 05, 2020 6:45:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 05, 2020 6:45:55 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 05, 2020 6:45:55 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
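
The readUsingDefaultMethod failure above is the schema/coder problem the message itself describes: the Row-producing step never has a schema attached, so no coder can be inferred for its output. A minimal, self-contained sketch of the remedy the error points at, PCollection.setRowSchema, is below; the four-field schema and the literal values are placeholders, not the table's real HACKER_NEWS schema.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Placeholder schema matching the four projected columns in the query above.
        final Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();

        PCollection<Row> rows =
            p.apply(Create.of("seed"))
                .apply(
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void process(ProcessContext c) {
                            c.output(
                                Row.withSchema(schema)
                                    .addValues("someone", "story", "a title", 3L)
                                    .build());
                          }
                        }))
                // Without this (or an explicit setCoder) the SDK cannot infer a Coder
                // for Row elements, which is the IllegalStateException reported above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }
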

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 05, 2020 6:45:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 05, 2020 6:45:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 05, 2020 6:45:55 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 05, 2020 6:45:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 05, 2020 6:45:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 05, 2020 6:45:55 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 05, 2020 6:45:55 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 05, 2020 6:45:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 05, 2020 6:45:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 05, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 05, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-SNB19SCicNmQDRyOF5o57DWpC-tDkx2OK3SkQill-ko.jar
    Oct 05, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-jZqlyBvjElEOGdVnSfYHIITzMEEIXokmSr1QjTqnxd4.jar
    Oct 05, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT--OjhFtfQMZGnYTOHp1SglbTM_r03nzFH0URGgyBpUg0.jar
    Oct 05, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-CjDTX6UcpUUdVSzrDrGm65YOi6s1qLJdTA-Qa-2pZVQ.jar
    Oct 05, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-lM1087iC1NNIjsnfcWQIuzGGftDoppFf_3VtzbQtO-k.jar
    Oct 05, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-_-sd2KhbKScnDemhqp0eicdnx5PTNJz8oohcet7QoL0.jar
    Oct 05, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-_NjmMj3tJO3oT-sRgBNPchNA5it2xUPwE1EbAHqXRfI.jar
    Oct 05, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-X-43ie_ykmibhJhDVWvNDtS9y_gCxtR56wxIiN7US7Q.jar
    Oct 05, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-BUa51mVqdCG-gtknZxjp_dLg_WW1gZ7kj1LlfKdAGL0.jar
    Oct 05, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3372886827757771829.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-BbpnL0c9i0ft_IDaDUwZpJNw4guv6vK6isZXUaG44S4.jar
    Oct 05, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-oj8WqYa79VSbDPPX_rZv6KGpfsQYNvupCDv0pfFmUlk.jar
    Oct 05, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-rUXSsST3xVovnyloYxh1xoMtSjtwcPCB9F78VKNXp1E.jar
    Oct 05, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-PhemHRGFI22bnEtMJp2MbN6xK_WknfX3QiAx4qaFfDI.jar
    Oct 05, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-zHnnqmWkQ31rUxBWMppuq07dJMeIH-qh2y60o6W2sZ4.jar
    Oct 05, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-UoFSu7TueM_y001VfGu8rkjswobFSk_333dfnZrpWKc.jar
    Oct 05, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-px3L5y5kKPv4h1LcUHZ6joAKw1R5Eh3aY4GI3BLcD1A.jar
    Oct 05, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-WWMdS0_tlpGkcxrfYjIxpLcQh79tzeS6WEEMC3VyOIQ.jar
    Oct 05, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-aYLX1cMTMR3ywyn68_kdt-zqkPB93cp1V6rq5oEHano.jar
    Oct 05, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-keIGroyV5lBrGhw7R6beszR3q4S-yW5AklqFtEvptlw.jar
    Oct 05, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-SNB19SCicNmQDRyOF5o57DWpC-tDkx2OK3SkQill-ko.jar
    Oct 05, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-8aNd6JqaCWK4WWJzK1-fFhTIHg6Mv0cSM483rwSYoU0.jar
    Oct 05, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-V8FKdw2pdnvUOR6GcnM8grqPSevtCSLAUU1IS1Zychw.jar
    Oct 05, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-HUBNuviYjEDAT8085yeivB7BwXLEA-skGB4zB7_AkZE.jar
    Oct 05, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-N02B9HfiPpdoU_-6Z-qq-X87dTXu9bTkN7ciu2-FYYI.jar
    Oct 05, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-pzqyS5Sx0vVCtx6fERjvDb-WUBM4sxznB8T8b-8_njo.jar
    Oct 05, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-g-d9UBl8b0bf2mix1GudKRm76VVb2V-q3RlUF1YTDGg.jar
    Oct 05, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-KAg5VTMKG6DmYGyJhLqZpYUV1m_YbWy8eIpsWO9UHGA.jar
    Oct 05, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-nJETk7Z0AKID7FBE_ZGCBtTkVsPZ48HrJ0sNekHGUb4.jar
    Oct 05, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-mamwtKgFj8jgNGHe-a8tJb_rA0grkJiVf3TjVejXLMU.jar
    Oct 05, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-SAuTO9g_SY5oQo-Bd5gluCCtHNrRQ0HX-E1kAzrP_CU.jar
    Oct 05, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-A_MWZyK1Sp0KD4F8-9eeeHI1kRJL1bGC9XMSraBIIyw.jar
    Oct 05, 2020 6:46:01 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 2 seconds
    Oct 05, 2020 6:46:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 05, 2020 6:46:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 05, 2020 6:46:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 05, 2020 6:46:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 05, 2020 6:46:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 05, 2020 6:46:01 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95918 bytes, hash 474be75780d333aa2d44cc6d7f716de5fef90dc70ddb864ddc3b01a778ab9794> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-R0vnV4DTM6otRMxtf3Ft5f75DccN24ZN3DsBp3irl5Q.pb
    Oct 05, 2020 6:46:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 05, 2020 6:46:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-05_11_46_01-7668239147419096228?project=apache-beam-testing
    Oct 05, 2020 6:46:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-05_11_46_01-7668239147419096228
    Oct 05, 2020 6:46:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-05_11_46_01-7668239147419096228
    Oct 05, 2020 6:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-05T18:46:01.916Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 05, 2020 6:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T18:46:10.297Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 05, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T18:46:11.053Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 05, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T18:46:11.095Z: Expanding GroupByKey operations into optimizable parts.
    Oct 05, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T18:46:11.124Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 05, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T18:46:11.191Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 05, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T18:46:11.232Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 05, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T18:46:11.262Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 05, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T18:46:11.291Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 05, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T18:46:11.734Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 05, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T18:46:11.818Z: Starting 5 workers in us-central1-c...
    Oct 05, 2020 6:46:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T18:46:38.769Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Oct 05, 2020 6:46:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T18:46:38.807Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Oct 05, 2020 6:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T18:46:44.237Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 05, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T18:46:44.667Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 05, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T18:46:56.230Z: Workers have started successfully.
    Oct 05, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T18:46:56.261Z: Workers have started successfully.
    Oct 05, 2020 6:47:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T18:47:34.563Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 05, 2020 6:47:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T18:47:34.705Z: Cleaning up.
    Oct 05, 2020 6:47:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T18:47:34.844Z: Stopping worker pool...
    Oct 05, 2020 6:48:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T18:48:27.346Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 05, 2020 6:48:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T18:48:27.399Z: Worker pool stopped.
    Oct 05, 2020 6:48:33 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-05_11_46_01-7668239147419096228 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 314e2ea7-f59f-4fec-9ac9-d9137b720f8c and timestamp: 2020-10-05T18:48:33.591000000Z:
                     Metric:                    Value:
                   read_time                    17.411
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 05, 2020 6:48:34 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.036 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 48.34 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 16s
107 actionable tasks: 73 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/v4dfie7yit4cm

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1078

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1078/display/redirect?page=changes>

Changes:

[noreply] [BEAM-7505] Create a dashboard for SideInput Load tests (#13007)


------------------------------------------
[...truncated 272.26 KB...]
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 05, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 05, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 05, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 05, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 05, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 05, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 05, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 05, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
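
For orientation only: the projection and filter being pushed down above correspond roughly to the hand-written direct read sketched below. The table name and field list are illustrative; this is not the code the test executes.

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply(
            "Read HACKER_NEWS with push-down",
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full") // illustrative table
                .withMethod(Method.DIRECT_READ)
                // Only the projected columns are requested from the Storage API...
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // ...and the WHERE clause is evaluated server-side.
                .withRowRestriction(
                    "(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
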
    Oct 05, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-oCuyzo66FsnweQuR0r-Hmp2ZUPm4IM1E6f5qZ3lV10E.jar
    Oct 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-MfBKoHOaRt1BhGQNdsLfxPIjBB-h_F_x3u7pGxJLbqk.jar
    Oct 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-pc0Cl1XUf9b2Jhp2Z0MI0Pggf1av8wI6msKVyqdOSW0.jar
    Oct 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-LJ_WGlbIwSbtXBBET0DzCHdzqpvCXQNbKLzk6_68kxM.jar
    Oct 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-7UlMv7BhVgMdReIaNpaJlXTeC6Q0A9U5NBuecOed5c4.jar
    Oct 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-5jOB6cha_6FS_8LdzA-L_lYpEHPyQnrcpO3d8IH4Uc8.jar
    Oct 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-a1JhzNZDPq9rzEjEFR4tuJejnrLqqVsK3yxzNvOyJLM.jar
    Oct 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-_HWwDX_kxfa2c7zFPdqBA5LxYgJukQJc1qOTtumhUuA.jar
    Oct 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-GF4W3IIlPTGv7_z43u8emVcYiGaE4Ziu8abblYevwFA.jar
    Oct 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-KY_U-9cI53nLuIxiUu3qNBrMiM6lND2LM3bZbkxKNug.jar
    Oct 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-QEr4BLYvUfcyr6ly_jMH7Dzh-90k5mi1mB0bDzgpGuQ.jar
    Oct 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-c17pM1NMwD2DlYdXk8-J42FsTKGwW0bUhIP5d4c4HWI.jar
    Oct 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-br7qbpeW_avjmu-XQbVesG3rNBtkElUSbpNydZv_FO4.jar
    Oct 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-KujGN_s5FgNQ9Y567kAIswZjKDliu-v528kHzVC4IhE.jar
    Oct 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-HJ3qJwoYrEiAMvG-Sea3afuGRQra4rfmfguZxxwO0D0.jar
    Oct 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-kb5NkNtuDocd2F4wHXuKPEDyJ2oN64LYGRHU92shC8w.jar
    Oct 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-oCuyzo66FsnweQuR0r-Hmp2ZUPm4IM1E6f5qZ3lV10E.jar
    Oct 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-SLsBpWaiezuyfIrCa1-Q4_aeDMOo24Stg25BFF1qIiE.jar
    Oct 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-LTIrzJLvMA3vtexbQTH-h6gCS408fIORQsFaGdygnME.jar
    Oct 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-tMrDk2il0YNWvJ09k1NvZbsCer2RVXCWM6kcY9VAny0.jar
    Oct 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-AmlVGQ5ovOW1fNPNimskMluVcxbBVaq7qIqmT4Unado.jar
    Oct 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test657381252747374269.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-5F1MR7KI33YkIt3hhz9ybMkXWnNTgm9k17LQecyIZM8.jar
    Oct 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-VzmpJFpG987vm9i6EZAuQRTKOG3_N27OtQSDlDw-ux4.jar
    Oct 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-jb_MTWKtGxKoo4CYbdNOv5lQ__FGGlBfyg0JXSB00RM.jar
    Oct 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-31Yz4PHnaSgib8tlnpYPASzkQRWrFoWNGXT6OnuReT4.jar
    Oct 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-_oAdjeBbl-Lj8UUZvsmGyJ9AjMA0nhUhZfubvT6HN5U.jar
    Oct 05, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-OP2K37eTNHVwqnAiDOW66uqWZjW5GgLaOHdX1PPOt6Q.jar
    Oct 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-ALxQwFjgYxs0IF5x-3lXhIFpFjDeoplZeXsh-V3wldA.jar
    Oct 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-P-3chtnyAfGqvIu1EyJhfHeyMcuoL4zt3dAv6_UQS5w.jar
    Oct 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-4aTzf6C4widQlTr_ElTIA6Hqf22TCsiNzfxh5dLd42g.jar
    Oct 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-2M7bsaQvyO7aKWcdk-dYYfPLoGIYzaDzTuc1uuk_9eU.jar
    Oct 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.codehaus.janino/commons-compiler/3.0.11/f2a6ec7fbc929c9fc87ff8bb486c0574951c5b09/commons-compiler-3.0.11.jar to gs://temp-storage-for-perf-tests/loadtests/staging/commons-compiler-3.0.11-DxpPXyZccBoxkzJErnBF_O8YtPpZUEF-Je5wvlDd2s8.jar
    Oct 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.codehaus.janino/janino/3.0.11/e699e368095379ba0402ea1780a87fcaea16e68f/janino-3.0.11.jar to gs://temp-storage-for-perf-tests/loadtests/staging/janino-3.0.11-kje3HSMpGA5ZIQ6aqhAO4xNFTvCuWIYIx1yxkxlZG-E.jar
    Oct 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/de.flapdoodle.embed/de.flapdoodle.embed.mongo/2.2.0/781d14f4e3d9eeb0b4c3e00a4ec165a04b3dc5c4/de.flapdoodle.embed.mongo-2.2.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/de.flapdoodle.embed.mongo-2.2.0-vNy3lJC0jW9u4Cy1AHsqSbjRUqOTX9ycpEmHkht7vvk.jar
    Oct 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-csv/1.8/37ca9a9aa2d4be2599e55506a6d3170dd7a3df4/commons-csv-1.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/commons-csv-1.8-qL1WZS7UZo2dWjOZSuUvWbnjnI6w68tmhOaK7udXmmE.jar
    Oct 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/de.flapdoodle.embed/de.flapdoodle.embed.process/2.1.2/986b38302fa10018d5baccee7f655c31ee9afe5b/de.flapdoodle.embed.process-2.1.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/de.flapdoodle.embed.process-2.1.2-OasY7D5KRAimcZcWcjFwgi8Qb4B-iff1FfrVeNSih6A.jar
    Oct 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.beam/beam-vendor-calcite-1_20_0/0.1/6d16a59dc771784789116607a04acd9a0ef91d16/beam-vendor-calcite-1_20_0-0.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-calcite-1_20_0-0.1-1NrX_9FNKiEqNk5qBOaRlj-IwqOvKvQIGIbTVgm_v8Y.jar
    Oct 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/antlr4-runtime/4.7/30b13b7efc55b7feea667691509cf59902375001/antlr4-runtime-4.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/antlr4-runtime-4.7-KmGUP4A7vR0OAt_9GbkqQY-DNAyZQ0aAnjtR4iMapsA.jar
    Oct 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/ST4/4.0.8/a1c55e974f8a94d78e2348fa6ff63f4fa1fae64/ST4-4.0.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/ST4-4.0.8-WMqrxAyfdLC1mT_YaOD2SlDAdZCU5qJRqq-tmO38ejs.jar
    Oct 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.mongodb/mongo-java-driver/3.9.1/d313237180bf9f2f82e12f503d9617e6b070f792/mongo-java-driver-3.9.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/mongo-java-driver-3.9.1-mxKxkvmYluxV-Hdn57uyt-MjjSQUsFjxFw9tjhx0bm4.jar
    Oct 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.abego.treelayout/org.abego.treelayout.core/1.0.3/457216e8e6578099ae63667bb1e4439235892028/org.abego.treelayout.core-1.0.3.jar to gs://temp-storage-for-perf-tests/loadtests/staging/org.abego.treelayout.core-1.0.3--l4xOVw5wufUasoPgfcgYJMWB7L6Qb02A46yy2-5MyY.jar
    Oct 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/antlr4/4.7/cd6df46532bccabd8127c18c9ca5ef481962e931/antlr4-4.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/antlr4-4.7-eGclcCizNzrwEd7nts6bWHqP1cegsl9osv9MuQvoqgc.jar
    Oct 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.glassfish/javax.json/1.0.4/3178f73569fd7a1e5ffc464e680f7a8cc784b85a/javax.json-1.0.4.jar to gs://temp-storage-for-perf-tests/loadtests/staging/javax.json-1.0.4-Dh3sQKHt6WWUElHtqWiu7gUsxPUDeLwxbMSOgVm9vrQ.jar
    Oct 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.ibm.icu/icu4j/58.2/db9fd4b4c189cf1518db14c67d14a2cfcfbe59f6/icu4j-58.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/icu4j-58.2-lT4eg7K-fD6i-I2obBNhT0fp5x01eMhSHX8Yd1a2OWI.jar
    Oct 05, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/antlr-runtime/3.5.2/cd9cd41361c155f3af0f653009dcecb08d8b4afd/antlr-runtime-3.5.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/antlr-runtime-3.5.2-zj_I7LEPOemjzdy7LONQ0nLZzT0LHhjm_nPDuTichzQ.jar
    Oct 05, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 176 files cached, 44 files newly uploaded in 1 seconds
    Oct 05, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 05, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 05, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 05, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 05, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 05, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95917 bytes, hash 4b2b3eadd89d999df38a1789feb7975d98a69b330c650912b2e01842afb1e237> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Sys-rdidmZ3ziheJ_reXXZimmzMMZQkSsuAYQq-x4jc.pb
    Oct 05, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 05, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-05_05_45_24-17989017567292138772?project=apache-beam-testing
    Oct 05, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-05_05_45_24-17989017567292138772
    Oct 05, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-05_05_45_24-17989017567292138772
    Oct 05, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-05T12:45:24.139Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 05, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T12:45:31.223Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 05, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T12:45:32.012Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 05, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T12:45:32.058Z: Expanding GroupByKey operations into optimizable parts.
    Oct 05, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T12:45:32.110Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 05, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T12:45:32.190Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 05, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T12:45:32.219Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 05, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T12:45:32.257Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 05, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T12:45:32.291Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 05, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T12:45:32.730Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 05, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T12:45:32.808Z: Starting 5 workers in us-central1-f...
    Oct 05, 2020 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T12:45:54.692Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 05, 2020 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T12:46:00.616Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 05, 2020 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T12:46:19.852Z: Workers have started successfully.
    Oct 05, 2020 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T12:46:19.885Z: Workers have started successfully.
    Oct 05, 2020 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T12:46:51.947Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 05, 2020 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T12:46:52.092Z: Cleaning up.
    Oct 05, 2020 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T12:46:52.164Z: Stopping worker pool...
    Oct 05, 2020 12:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T12:47:40.707Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 05, 2020 12:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T12:47:40.760Z: Worker pool stopped.
    Oct 05, 2020 12:47:53 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-05_05_45_24-17989017567292138772 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c992ca21-657b-416c-90b8-1f6253720954 and timestamp: 2020-10-05T12:47:53.638000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.736

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 05, 2020 12:47:54 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 44.898 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 36s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/pafsbr7xgctg6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1077

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1077/display/redirect>

Changes:


------------------------------------------
[...truncated 270.80 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 05, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 05, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 05, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 05, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 05, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 05, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 05, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 05, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 05, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 05, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 05, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 05, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 05, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 05, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 05, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 05, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-iSq9P0MZvCbSwXpxV4RQQHR4l8a_Q-itxiktaMTEWMY.jar
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-QVAdUYPgmOSrpILtqagLCFGcws1KtRqjZjGZHiwKnGI.jar
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-QBUENr3TcJdV_JOWmLlWsS8Yz6-tcIWpSCeHYDMwYvc.jar
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-2ORaOjBZ971H9U1Mmo1m2CYAYSI8Y4tNgfBFA95gusQ.jar
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-fGf5yVKHi5gZgqRY0Jd_0TC0KbEC_-8J2yZoVJw6_5U.jar
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-158Gw1-iNuT5HcTiA1dodlnd4gkToiT3GE5iYEicweM.jar
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-fi8pLKnV_naulrl5jWmtWMAXwMQwuTRdRq59qrJDn_c.jar
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-CzK2VrUMTOpCqcg4TqQQq6Dz7Z8Fma8fXbV7hjvimDQ.jar
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-sok9tx9b3OxqS4dqadzZdr2xoE8twrzBGy_kc0BzOGs.jar
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-yzCCgr9qDJ-0_s8ahGDGPlGiXkvNQ0dCv66rG8y-o-I.jar
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5182527868502814166.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-hnkr33pRLdyOaRNx14OE7ffg9l_zXvRDx3o7SCy1Iwg.jar
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-w8nrcgZCvaSWPzwWwCL-DFfd_VIy7Pe7tMCkJ_TAdn8.jar
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-sok9tx9b3OxqS4dqadzZdr2xoE8twrzBGy_kc0BzOGs.jar
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-ELvGRkOGnhp_Cy76f0Redx2BIMcwE1K4MkwOyDNyA9Y.jar
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-CiMHC3bhMmpYm9YZayzriWG5LY-NYGoNDx1SvUzWiTM.jar
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-bnId0rSePU132MFnKGSw75cTWewbsZ0SLHv6zgf-Bq0.jar
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-dsvKd-uqY5DDjQD7qbUlgCuT9Dw9-kxDNwsJfUG5Exw.jar
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-C9dMq7TgB2bBnzYmP4k9y-rHwZI2ELHedvEwdnuAPJs.jar
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-RTej-ILpSQaoytzvFEgFnLlpryeCf8k4JRRyvzpprRg.jar
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-ZDRW0EEsR5nvDJlDiL5fXSI_xIvQ3tYHWF_Bu8D1fWU.jar
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-VrgJ0s4EPsVIM3OYMXCrwdIb_GsFDC0LbPJ0DgIHvDM.jar
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-NaQNBirlew5cYlYU5mizGQqSWVCIpJm3uLYsTmyIyvo.jar
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-ZQNshWh3UDSt-stROfw9gv6qNSiRZ7ealPQJe_p7HKQ.jar
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-n0raILofl31n8uEau4Q0GTeW3Ca2qcDnnKvAoPM0BUI.jar
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-ZYjWl94oC2AlaSGGltmcp5d_UIjXRldMMnvlKQ-hhCs.jar
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT--ZdjE1qPr4aQ2KXp3N-2EENB9S_adrsZRRQyQlmgeH0.jar
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-_RyR97CIk_gN6wVudh0wWACJEq9ClkxboFAkNFhoEPU.jar
    Oct 05, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests--_5uNSXa654QJaKT0LUtUhG-5qP4BOmsLv1hPNxdJNE.jar
    Oct 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-7Ld2zrsoPjITHXW6XZKjCUnlxHF4d9cPXkplqARRH5I.jar
    Oct 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-3ywtZ5ntPWxYpp3d-WWPvfrCAAchipJT6tac5HFyMEA.jar
    Oct 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-N1XXUP-qPkIFbJ4zhNH7pcJF6iZc5hTOG6G0Qp64kTg.jar
    Oct 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 0 seconds
    Oct 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95918 bytes, hash bbd00b7c4680009033f83d4c6fc37a0b988e28f8be1ec265edf5f48d4f77de37> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-u9ALfEaAAJAz-D1Mb8N6C5iOKPi-HsJl7fX0jU933jc.pb
    Oct 05, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 05, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-04_23_45_17-7466248095752740026?project=apache-beam-testing
    Oct 05, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-04_23_45_17-7466248095752740026
    Oct 05, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-04_23_45_17-7466248095752740026
    Oct 05, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-05T06:45:17.931Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 05, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T06:45:25.263Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 05, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T06:45:25.990Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 05, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T06:45:26.069Z: Expanding GroupByKey operations into optimizable parts.
    Oct 05, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T06:45:26.105Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 05, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T06:45:26.192Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 05, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T06:45:26.219Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 05, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T06:45:26.253Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 05, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T06:45:26.287Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 05, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T06:45:26.701Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 05, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T06:45:26.781Z: Starting 5 workers in us-central1-f...
    Oct 05, 2020 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T06:45:47.061Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 05, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T06:45:49.494Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 05, 2020 6:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T06:46:10.031Z: Workers have started successfully.
    Oct 05, 2020 6:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T06:46:10.058Z: Workers have started successfully.
    Oct 05, 2020 6:46:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T06:46:43.058Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 05, 2020 6:46:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T06:46:43.315Z: Cleaning up.
    Oct 05, 2020 6:46:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T06:46:43.387Z: Stopping worker pool...
    Oct 05, 2020 6:47:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T06:47:35.907Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 05, 2020 6:47:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T06:47:35.947Z: Worker pool stopped.
    Oct 05, 2020 6:47:42 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-04_23_45_17-7466248095752740026 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): caf38787-4887-47ef-9485-0dc851583a94 and timestamp: 2020-10-05T06:47:42.277000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     12.67

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 05, 2020 6:47:42 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 38.234 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 25s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/keuhsdlomedfy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1076

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1076/display/redirect>

Changes:


------------------------------------------
[...truncated 272.86 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 05, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 05, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 05, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 05, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 05, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 05, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 05, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
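
The remediation is the one the exception itself names: a PCollection of Beam Rows needs an explicit schema (or a RowCoder) before the pipeline can be expanded. A minimal standalone sketch of PCollection.setRowSchema follows, using a made-up schema and values rather than the HACKER_NEWS table and the IT's actual code:

    // Sketch only: demonstrates PCollection.setRowSchema for a Row-producing
    // transform, as suggested by the IllegalStateException above.
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TypeDescriptor;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Hypothetical schema standing in for the projected columns.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(Create.of("story"))
                .apply(
                    MapElements.into(TypeDescriptor.of(Row.class))
                        .via(
                            type ->
                                Row.withSchema(schema)
                                    .addValues("someone", type, "a title", 3L)
                                    .build()))
                // Without this call the runner cannot infer a coder for Row
                // and fails with the error seen above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }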

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 05, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 05, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 05, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 05, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 05, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 05, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 05, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 05, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
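
At the IO level, the push-down logged above corresponds to requesting only the used fields and sending the supported predicate to the BigQuery Storage API. A minimal sketch with BigQueryIO's DIRECT_READ options follows, using an assumed public table name rather than the test's actual source:

    // Sketch only: roughly what the planner's projection/filter push-down
    // amounts to when reading with the BigQuery Storage API.
    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    public class PushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Assumed table name, used purely for illustration.
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full")
                    .withMethod(Method.DIRECT_READ)
                    // Project only the columns the query uses...
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // ...and push the supported predicate into the read session.
                    .withRowRestriction(
                        "(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
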
    Oct 05, 2020 12:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-dwHuOzONCWuocMYGOa5SW6RT0qnY-jWZHmRQmPwWNAI.jar
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-c3CLtjgIFpkA5ui4SdxCUQof2WPYc7EuKap63wKDQGc.jar
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-1weCow_Bhw-NSeHaQfvtQ5UYM3HGsOZcLag6CIJVGrs.jar
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-xt2I148Oo4PCV0hAs6ollq95mLsWlvE_10O3-ka74WY.jar
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-9WcX7tk6UzbZD-V-_zEFz3yut8YU_1P5SVZ7d8adyZI.jar
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-lraa15KiVS4TsmCyoetFk5Kt_BW7uMlgdNhrqBUB5mw.jar
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-DYV77Qvqf5bh6BAKS9RTojxZTm-1WiBTTL9VefmvkwQ.jar
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-bWRApYACbU0dJuYPH9NcG7txT1e29eG9_GgntjU_LsY.jar
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-mSzW3IMT5Mx8lHzAyTuCBYA0HBBfBkQmJNGIosRwUlw.jar
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-DAk2uF5nA4zqgl99pj3s-LPiGdZpvPtEK16Z3zH_HpY.jar
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-b0ZVzP6rQqRggZpZ7jPPq3YtW5yGeBYbqrMNbe1nIfY.jar
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-h_MW94q-Mk0TLizN0K45kqfF0xhh4FQUzATc-0rc72w.jar
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-2zs-K7WpDkzVpKUleAmI-qacBU-hwe09bGEkZQGzOJg.jar
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-N1--Fs8yZO9qWyDb1AaBa6N7MDpDwEMB_cVSLtEKgms.jar
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3152701569011119890.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Pf8paCf8TEgCzA7pm-Mp3rK86XwcQ5-IMOEkYhaRc24.jar
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-D4ZuEYAn2NgAiCHtxFLjmL2PpEI5xMM-D8m8gGYJ8pE.jar
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-dR1pMDLTdwm7OzfdyT59ckE8hBpOeCQequRHe8YvEY4.jar
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-zqCEwgWvib6RgubJ-Ll8AXtFvznafnqDGGcqo3h9ZxM.jar
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-nKYmkBcAuSNAnH9Kjlm1Bs6-7Nx22hwcuh4oSapZtcY.jar
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-rGScDbeazfegK5gNyFdI3WsBz3z6YmxNR8XPU4p9UWo.jar
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-Ib8ZPz6SrpIV_ADW_1vAzzMANFhncjt2o5YlIdQ2GDE.jar
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-xnx_SXC4kbO2YFdGpFAH45LjaJTjYNyK5mpuLrV35ck.jar
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-8XLrL7Ztt3LTlnmwauDSoSThpwXR8lekWP7NeEeTz7g.jar
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-2zs-K7WpDkzVpKUleAmI-qacBU-hwe09bGEkZQGzOJg.jar
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-lavihZaGBr8sHgSWs61I9UXDiGUgKTUKH0X8pFMqwwc.jar
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tNcip9UTaBj5eki7KIK66_VRBnjpqJEuB52HBd9NNR4.jar
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-wz9iAbBH2A3ZgegZpIts2oIu6PH4fzDa0H3ooHmMZnM.jar
    Oct 05, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-6C0zgw4uYkVdXbPYWPg8WTDpVdWP0Hi7WOOWgPHzvT8.jar
    Oct 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-SJ6Fhkcun1z1Wregoy-DVaT2pvOYoqF8Q8E9nZwePWo.jar
    Oct 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-q_W0D5x0DYLPaa3wQJBdxAgUKvgtNpPPcTqSCGc6VKs.jar
    Oct 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-u7h_1qoOOLPdQgy_7RWZv2HsyvAaL4x2-xUkv_GL650.jar
    Oct 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 0 seconds
    Oct 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 05, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95919 bytes, hash 8adba20edf1b13bec830cf8b59097eabf7ec89dbc82fdefa8f84dfb3b9599ed5> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ituiDt8bE77IMM-LWQl-q_fsidvIL976j4Tfs7lZntU.pb
    Oct 05, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 05, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-04_17_45_20-16101284726545935877?project=apache-beam-testing
    Oct 05, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-04_17_45_20-16101284726545935877
    Oct 05, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-04_17_45_20-16101284726545935877
    Oct 05, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-05T00:45:20.232Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 05, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T00:45:28.391Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 05, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T00:45:28.966Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 05, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T00:45:29.005Z: Expanding GroupByKey operations into optimizable parts.
    Oct 05, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T00:45:29.056Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 05, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T00:45:29.127Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 05, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T00:45:29.155Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 05, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T00:45:29.179Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 05, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T00:45:29.200Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 05, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T00:45:29.538Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 05, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T00:45:29.614Z: Starting 5 workers in us-central1-f...
    Oct 05, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T00:45:49.931Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 05, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T00:45:56.294Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Oct 05, 2020 12:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T00:45:56.322Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Oct 05, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T00:46:14.998Z: Workers have started successfully.
    Oct 05, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T00:46:15.033Z: Workers have started successfully.
    Oct 05, 2020 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T00:46:17.078Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 05, 2020 12:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T00:46:50.786Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 05, 2020 12:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T00:46:50.934Z: Cleaning up.
    Oct 05, 2020 12:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T00:46:51.017Z: Stopping worker pool...
    Oct 05, 2020 12:47:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T00:47:36.959Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 05, 2020 12:47:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-05T00:47:37.005Z: Worker pool stopped.
    Oct 05, 2020 12:47:43 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-04_17_45_20-16101284726545935877 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c33139d3-1491-4960-85a7-7a8253bf213c and timestamp: 2020-10-05T00:47:43.341000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.279

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 05, 2020 12:47:43 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 36.928 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 27s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/ued6ap2a6a6xy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1075

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1075/display/redirect>

Changes:


------------------------------------------
[...truncated 272.50 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 04, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 04, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 04, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 04, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 04, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 04, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 04, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 04, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 04, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 04, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 04, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 04, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 04, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 04, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 04, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 04, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-m5qLogIz59qpvS53ijAoXg0GIasIka6NGpztcScdOpw.jar
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-cIosAChvCtyxywR029x78ZiEmN2ZOy_1XLoq-6fiS48.jar
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests--RsIyDlBcyFKHHOdyJjPbzJwCNvWQqqe5y8d6skLdUg.jar
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-s4fkUEgvovD1AuEU1lz981FFFFQC_TSRxfcnMscNF2s.jar
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-Qk8wVEB-r9ghVY-gMD0Qrme-8Y7eMioKsLtvGiQYQrs.jar
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-MKuayLW_-AHfJpCkCAxmzt_BxJymXPezJS_ytC-ncS4.jar
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-iusYfpQy2N3dgyyJ5PjFKgoB0Hc8tqOptYD-bCDa-WA.jar
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-ublk3wDGsg8oMYgIu1oO00HVgfZSaAmU5bfTPPxNFXc.jar
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-t129NyDzWry5LMHoKF8OpCvK40IabNxP09AdHna_PvI.jar
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-FZQJuF_lQasSviH1IKIBzWqc2-6JH6A4YmRUnshz59g.jar
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-LAOV3jrh8bgrl3g5_Uir10Pklqh70DQOXw6UUgnRCkY.jar
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-EIShrYc5VROff0WLmAy5YZHnPI1mKIIHhW7xdD6qcnY.jar
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-bKDHS6J5SCmSRqkQhiyYNjZB5jbBroSSZB1kZRc0l1w.jar
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-Ql-zWrvVl5ri3OUM_H4hlY-0-M5lGBNIxY2ucoh5Gk4.jar
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7226474627975684407.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-GmmyV3A9yJp6gaC48gUttodeu2DVLjT9mse4dWp7gRg.jar
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-7omyiOIS1_cucGxuXR6kqJCwhpVMjGh1YSqn_fulRqU.jar
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-m5qLogIz59qpvS53ijAoXg0GIasIka6NGpztcScdOpw.jar
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-Kziqr6xPogrJ4_W00aReFvD8SUN5rRV2eJdBicFqGN4.jar
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-_Q64iexU7m36joZiZUA4KSH3vjSKY3bqUUW586YWFQo.jar
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-stRBhHPpGM1x7okzBMR6NTD289ANn-eXzBzh-pK-IQc.jar
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-TQWehOi0_zRdhVtHYAIW2y5RNI_Tj5SuktRsHR2Wuao.jar
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-VVaq_wC2ioggH1S1LkPREwM8f1TkdjbN719CB4w3p20.jar
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-xyiUPv0FE6gFrPQHWfcIhGQn4jMwRRbB7WoD63YEsDE.jar
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-3YjAtbqJNf3x5Tgs2Fo2RWCOivm4fx4Te-u_649i_Kg.jar
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-CBj7SeSGtHtaTHQLLKQ15W-cjmrJ2DjafBkdstfxxmo.jar
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-B02qX5b65_zr4_WA6TPCia_97iALtN3va985dwEgy2o.jar
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-5cVb7bGirUb-CGmHFw0e2nTwyEg-ABO2pTqUv7980qw.jar
    Oct 04, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-wxYLpf0vqTfu_M8bH_W8VPmf3PHb-QiVtzLIcWESGFQ.jar
    Oct 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-ucOQcFOlaSjNktvyTrOs0I5Chr5NkcIXXFKm8tS9-Yw.jar
    Oct 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-giPNiYKJG_8gGiLuBNjwDKRT-zUebHA7ue7bmlZ1ams.jar
    Oct 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-vSSD6ERBdMkVKB5SWH4tJkfnVTF6JGGH1unVl5QElTA.jar
    Oct 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 1 seconds
    Oct 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 04, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95918 bytes, hash 33cff7859a4fb10370ea44167c561f3336172afe0954980a8ddbacd010ef7c98> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-M8_3hZpPsQNw6kQWfFYfMzYXKv4JVJgKjdus0BDvfJg.pb
    Oct 04, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 04, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-04_11_45_22-6330749841225498909?project=apache-beam-testing
    Oct 04, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-04_11_45_22-6330749841225498909
    Oct 04, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-04_11_45_22-6330749841225498909
    Oct 04, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-04T18:45:22.265Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 04, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T18:45:29.063Z: Worker configuration: n1-standard-1 in us-central1-b.
    Oct 04, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T18:45:29.798Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 04, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T18:45:29.828Z: Expanding GroupByKey operations into optimizable parts.
    Oct 04, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T18:45:29.866Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 04, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T18:45:29.925Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 04, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T18:45:29.955Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 04, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T18:45:29.987Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 04, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T18:45:30.022Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 04, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T18:45:30.561Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 04, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T18:45:30.632Z: Starting 5 workers in us-central1-b...
    Oct 04, 2020 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T18:45:37.329Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 04, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T18:45:55.872Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 04, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T18:46:11.466Z: Workers have started successfully.
    Oct 04, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T18:46:11.501Z: Workers have started successfully.
    Oct 04, 2020 6:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T18:46:42.037Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 04, 2020 6:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T18:46:42.149Z: Cleaning up.
    Oct 04, 2020 6:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T18:46:42.203Z: Stopping worker pool...
    Oct 04, 2020 6:47:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T18:47:35.160Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 04, 2020 6:47:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T18:47:35.193Z: Worker pool stopped.
    Oct 04, 2020 6:47:42 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-04_11_45_22-6330749841225498909 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 31872e99-2653-439e-b348-5f36bc840c3d and timestamp: 2020-10-04T18:47:42.099000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.043

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 04, 2020 6:47:42 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 34.065 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 24s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/arkqtx3c4tx46

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1074

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1074/display/redirect>

Changes:


------------------------------------------
[...truncated 270.67 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 04, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 04, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 04, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 04, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 04, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 04, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 04, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
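
The root-cause list above points directly at PCollection.setRowSchema. A minimal sketch of the kind of fix that message suggests -- assuming a hypothetical DoFn<Row, Row> named RowMonitorFn and a schema matching the four projected columns; this is illustrative, not the IT's actual code -- is to attach a row schema to the ParDo output so the SDK can pick a coder for it:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Hypothetical stand-in for the test's RowMonitor transform: a pass-through DoFn over Rows.
    class RowMonitorFn extends DoFn<Row, Row> {
      @ProcessElement
      public void process(@Element Row row, OutputReceiver<Row> out) {
        out.output(row);
      }
    }

    // Schema matching the projected columns of the query above (types assumed for illustration).
    Schema schema =
        Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

    // Rows emitted by a plain ParDo cannot have their coder inferred automatically;
    // attaching the schema lets the SDK use a RowCoder for this output.
    // 'input' is assumed to be an upstream PCollection<Row>.
    PCollection<Row> monitored =
        input
            .apply("RowMonitor", ParDo.of(new RowMonitorFn()))
            .setRowSchema(schema);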

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 04, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 04, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 04, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 04, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 04, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 04, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 04, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 04, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
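    For reference, the projection and filter pushed down above correspond roughly to what a direct BigQueryIO read expresses via the Storage Read API. A hedged sketch, assuming a hypothetical pipeline `p` and an illustrative table spec (this is not the IT's code):

        import com.google.api.services.bigquery.model.TableRow;
        import java.util.Arrays;
        import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
        import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
        import org.apache.beam.sdk.values.PCollection;

        PCollection<TableRow> rows =
            p.apply(
                "Read HACKER_NEWS with push-down",
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full") // illustrative table spec
                    .withMethod(TypedRead.Method.DIRECT_READ)
                    // Only the used fields are read from storage...
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // ...and the supported predicate is evaluated server-side.
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));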
    Oct 04, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-UdvSR9QVphuqHgCbZPnT7iBNbdMVzIWiH4QkQJuOALA.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-2LAVpDQmx1p-8yB2T1VdmU9ddaWKP04gCAys5XwlmcQ.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-7jvBXWURmJQbP64SpLQc_Gsj9gT8tRDgUrR1fsZkf0E.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-Uv_0xmXoQcmQPeCvsOTML5kYqDe8tkZPpFd83Cp4Rdg.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-SEUdbjVJ42LkDL4ev3-WeLtPOTjqBxi47IDBtiEvRio.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8188141483593557526.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-jI4qiUw5JJfdu7ge4ame0bbQu5ISS9vqiQuszcUeS6s.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-1YYvB2ugMk7BPh1s1rEMXuCQl9v-NvuaieZSQ8NOgbE.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-vLCRov0oseUeTHjkouzUf-_zwd-0H-w2U9r-RrMc-m0.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-B7_ImEfLbz8FflcvlZXWfGso40EFzt-1QHk4skp3Pcg.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-8iKEv8TPI9Ng__m5O8YHPr09YiwIwX9yEjGgLCbiG-U.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-lF79JynrWrxnP01mGuANT0fyry2OIyMVULd5CFBR9cM.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-TV0oj7sBBs5x-JzhmSNuog9zCjaqdL50mjJee_SHYxQ.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-bSdd8qOQm8EAEr3BKgMWt1968kufpUFlh5aWkaLVk2Y.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-EVK2vaflnn9_-B4bNPafSeNm46jWUz2coMTv2GiImyM.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-2LAVpDQmx1p-8yB2T1VdmU9ddaWKP04gCAys5XwlmcQ.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-MH7-vyqmazSqQNt8W7vK1i6F9yTp2EIUaSapdV3Tk7g.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-pxMigOpp5T3p-u1hd09IP8lmFe_FGSO2HNyUxD4QzTg.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-V5sXgR2iCSDlg1n8XfchP9RawIwipL0ON9F2j1Lxqds.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-Fpos2ly_cAI1uCbzncwjvMzaAFArULVWPVR8SIVyRAA.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-d0-nSktey08pROLw-OgMpDmTdKlRT2uvEiWrb-I8lb8.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-nBEecAc6SMgeAKFwSMIc1tUmD22IMpZ7tT5bAz6PlWw.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-YrBtcz8YxbWP2Rqy_y4yS8leIFOf_qUtGTGYcOhw_Zc.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-evYcrhlzV-Um8C-OwwDCkypsAdb9avVPVJr9sCfTrKI.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-7mgU3gbgDm9o5URn4JVnqXktQinCyMRQfNCxGV0EsEU.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-1KwuVA3yrJ0DCWNzN292zTo0J5Mmz6C1lvZDbCiJCLI.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-AM9eBB_Pi9bWRq7BYixtQ1dqxSa5ktUN62hwAbCmpAc.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-ghqKsXRavZ-jf43Iw3Okd-YNx4hiZwQLhVV36ctabrU.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-b3YQG-2lIxaSJcBquwr9I397EoZxYhj9CW-sg-NNziQ.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-85x9RdzljbAZsbuZzrRpmQiYk9riQV8LMXoLDaEEn50.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-72JLBgtAKNbwJuxpmzWvLQCzuX5jo4NgEqWwDD9nYLs.jar
    Oct 04, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-1OTZlhQ0OepwuPDN8c422KRo4xGfKDpYnyajQYXcd1g.jar
    Oct 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 1 seconds
    Oct 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95918 bytes, hash 521acbf01ed44bc37d135a4d01512eb0115e57b7648e26fde2481fd2450331c1> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-UhrL8B7US8N9E1pNAVEusBFeV7dkjib94kgf0kUDMcE.pb
    Oct 04, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 04, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-04_05_45_20-14969852273036923819?project=apache-beam-testing
    Oct 04, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-04_05_45_20-14969852273036923819
    Oct 04, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-04_05_45_20-14969852273036923819
    Oct 04, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-04T12:45:20.922Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 04, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T12:45:27.831Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 04, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T12:45:28.460Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 04, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T12:45:28.490Z: Expanding GroupByKey operations into optimizable parts.
    Oct 04, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T12:45:28.521Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 04, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T12:45:28.586Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 04, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T12:45:28.625Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 04, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T12:45:28.661Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 04, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T12:45:28.703Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 04, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T12:45:29.135Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 04, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T12:45:29.217Z: Starting 5 workers in us-central1-f...
    Oct 04, 2020 12:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T12:45:44.406Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 04, 2020 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T12:46:01.175Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 04, 2020 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T12:46:19.000Z: Workers have started successfully.
    Oct 04, 2020 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T12:46:19.046Z: Workers have started successfully.
    Oct 04, 2020 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T12:46:51.617Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 04, 2020 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T12:46:51.811Z: Cleaning up.
    Oct 04, 2020 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T12:46:51.896Z: Stopping worker pool...
    Oct 04, 2020 12:47:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T12:47:48.635Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 04, 2020 12:47:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T12:47:48.674Z: Worker pool stopped.
    Oct 04, 2020 12:47:54 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-04_05_45_20-14969852273036923819 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f28d9960-4aa1-40f2-b71e-c13f45564b7d and timestamp: 2020-10-04T12:47:54.877000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    11.038

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 04, 2020 12:47:55 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 47.618 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 38s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/uuyljyybun4gw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1073

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1073/display/redirect>

Changes:


------------------------------------------
[...truncated 272.53 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 04, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 04, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 04, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 04, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 04, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 04, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 04, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 04, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 04, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 04, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 04, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 04, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 04, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 04, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 04, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 04, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-fjS01nuO4B868nKJZdzliCJ-oOuCv31rTau5QxOIAwY.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-i6bN3T55TCcztsVrjswxWnYxcin4aE2FOGV5kew3v_8.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-9hJN1rshHmPKmqp5yXiC6MERUrq6542gGq3BFa74pwc.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-xfDosjGmJHngxwiz1BfGOxwPkGCEjf5qAUFMh_W2DHQ.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-E76tWd12t7pCRGjKnxKvNFLvip6HTHqRLOcUv-vMdWI.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-fjS01nuO4B868nKJZdzliCJ-oOuCv31rTau5QxOIAwY.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-rE24gfq0BQSdbEfFKfmroxu7GM_9mBiS-U6l_UgsGNA.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-c-4xK8OXOksh3YbsHrdS8cR6GXSq2bFEU4ba_mQS1W0.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-uy6UJc_VeEIbh-PmTloSSxc2qa9JEcGuJ0Qxxb5yRK8.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-O46Fu5NB45up_33UlSo7BMemhPIOwgyBkqGF8sdNrQs.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-nZjHn1f_k1pJ6HAeuAz817BJ89vFxqVESwq05WdEW2g.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-XZB55Bqb6IVYN_yRAsFFjP48CxyFA960TeThv5m004E.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-Bho88z5MseJLBANLB4JI_xaDKoVbn8BF17hmaQNe80Y.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-QSnr6fdPeQfH1vkiF8ENU38ylhtYEbRnMvsfKekYjCE.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-aqA2UalI-PLW3Y3qBhCgJkCA74VkQGzsMGutJLSgX2A.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-M2J1gL_Fshda_mS_7CXzm70nBDUZE8rSbTfnZK7ZvA0.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-3aHOv3S4SEcGwWvB9WM3OoWITnqGyIPO-UuuzMOmOUM.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-jBrwFXfCUUXqGlY0Rr4EMB0ECExe9BEXGPb-L2fLe6U.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6785317055426863842.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-kKBifuF78KlM_ZpCi412jvSQtf-MNvBdaVfOKkuBbM0.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-EhxxLS5Q1wQ8RozLYiq7MNi2REKrwsWewX7cAZImFok.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-pHy37aPumgoR1PH_xSvq5EFJ3jbXURlwHJFK5obhICA.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-KLmBYQ7--KxaHROhz6mcpLhGKxDzFXMr8NzxEj_9YKs.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-dyf7zhDV-3Q37iH3fQBCY4S5hqQZGDaL8t2PSBJJ-oM.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-gEU2wscQC3TJS4MFm4mSrceB6G6dch0oniUmrmGRmg4.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-MO70Ba_wknphFuEAtOr4lqabT5YiE9EyHf55hCDcRIU.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-nnnU52dsF0emnDbqlkuCO6UzhQP4kWbuUU-kuPWjoa0.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-JMVsPs2zfKPExC4KdSKTKwVXBEDCFHl1NMzdTrK4lxU.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-mNjLKznBpQcEWVRnU68ACN5tZ_g3EEevBJUwjFWPKm8.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-x7_OXJymRA1hYm2cxcZ85Czp0XW40pLnMsTXMnzsGwc.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-2E903VObP8pKtuXmna1g4zlKP_pqMLzaynYuqJfSI-g.jar
    Oct 04, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-QczjfWIv1rpKXtB2--9KLAGmbQNghRg69VNZZfIux3M.jar
    Oct 04, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 0 seconds
    Oct 04, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 04, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 04, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 04, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 04, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 04, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95918 bytes, hash d717b01c6f7138e926e604db0ad884adccbb6b942e0dce97f7027cb34b974509> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-1xewHG9xOOkm5gTbCtiErcy7a5QuDc6X9wJ8s0uXRQk.pb
    Oct 04, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 04, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-03_23_45_18-12130109800357455786?project=apache-beam-testing
    Oct 04, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-03_23_45_18-12130109800357455786
    Oct 04, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-03_23_45_18-12130109800357455786
    Oct 04, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-04T06:45:18.877Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 04, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T06:45:24.854Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 04, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T06:45:25.586Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 04, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T06:45:25.627Z: Expanding GroupByKey operations into optimizable parts.
    Oct 04, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T06:45:25.658Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 04, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T06:45:25.808Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 04, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T06:45:25.854Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 04, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T06:45:25.889Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 04, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T06:45:25.920Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 04, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T06:45:26.319Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 04, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T06:45:26.392Z: Starting 5 workers in us-central1-f...
    Oct 04, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T06:45:46.969Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 04, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T06:45:53.708Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 04, 2020 6:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T06:46:09.389Z: Workers have started successfully.
    Oct 04, 2020 6:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T06:46:09.467Z: Workers have started successfully.
    Oct 04, 2020 6:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T06:46:46.039Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 04, 2020 6:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T06:46:46.182Z: Cleaning up.
    Oct 04, 2020 6:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T06:46:46.268Z: Stopping worker pool...
    Oct 04, 2020 6:47:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T06:47:39.516Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 04, 2020 6:47:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T06:47:39.562Z: Worker pool stopped.
    Oct 04, 2020 6:47:44 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-03_23_45_18-12130109800357455786 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 3a4d6055-4cb1-4625-b368-2660f2b267c1 and timestamp: 2020-10-04T06:47:44.542000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    16.881

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 04, 2020 6:47:44 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 39.148 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 27s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/5htvpeui4dhik

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1072

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1072/display/redirect>

Changes:


------------------------------------------
[...truncated 272.24 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 04, 2020 12:46:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 04, 2020 12:46:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 04, 2020 12:46:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 04, 2020 12:46:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 04, 2020 12:46:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 04, 2020 12:46:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 04, 2020 12:46:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
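
The readUsingDefaultMethod failure above is exactly the coder-inference problem the message describes: the PCollection<Row> emitted by ParDo(RowMonitor) carries no schema, so Beam cannot infer a RowCoder for it. Below is a minimal, self-contained sketch of the remedy the error itself names, PCollection.setRowSchema; the field names and types only mirror the SELECT in this log, the pipeline contents are invented for illustration, and none of this is the integration test's actual code.

    // RowSchemaSketch.java -- illustrative only; attaches an explicit Row schema
    // so Beam can infer a RowCoder instead of failing as in the log above.
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TypeDescriptor;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        // Schema assumed to match the projected columns (author, type, title, score).
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        PCollection<Row> rows = p
            .apply(Create.of("seed"))
            .apply(MapElements.into(TypeDescriptor.of(Row.class))
                .via((String s) -> Row.withSchema(schema)
                    .addValues("someone", "story", "a title", 3L)
                    .build()))
            // Without this call, pipeline construction fails with
            // "Cannot provide a coder for a Beam Row", as seen above.
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }

Setting a coder explicitly (the other option the error lists) would work as well; setRowSchema is simply the schema-aware shorthand for a PCollection of Rows.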

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 04, 2020 12:46:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 04, 2020 12:46:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 04, 2020 12:46:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 04, 2020 12:46:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 04, 2020 12:46:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 04, 2020 12:46:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 04, 2020 12:46:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 04, 2020 12:46:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
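
For context on what the push-down in this plan buys, the BeamPushDownIOSourceRel and the filter logged above correspond, roughly, to a BigQuery Storage API (DIRECT_READ) read that requests only the used fields and ships the predicate as a row restriction. The sketch below illustrates that shape with BigQueryIO directly; the table reference is a placeholder and this is not BigQueryIOPushDownIT's own code.

    // PushDownSketch.java -- illustrative only; the IO-level analogue of the
    // pushed-down plan: read just the used fields and filter server-side.
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class PushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply("Read Input BQ Rows with push-down",
            BigQueryIO.readTableRows()
                .from("my-project:my_dataset.hacker_news")  // placeholder table
                .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction(
                    "(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
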
    Oct 04, 2020 12:46:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-ET9a1p4PsHnSjzTXxpmo9kp2MNvhyA0FkwxvatyjcBI.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-XX3T1-AkZXArhucZyf_DltDiaGk84Geia4xUd0jX_uk.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-zw3rgNq-r5Zu0MANfUkx3KSVqgWkThCLsVGw9ugxGdQ.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-tx497Hh5tapNlcdECjO7FiHU44MFGbhXabkFB2_wKaM.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-3-4IGqX6MogqAqxxv5TvZybTfFHa0VbchON_Ff7CXCc.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-EtGroUUar5A0-kSupsL2Hj06eK0kusS7R8gMU0iEBCA.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-noaMpmUPWsRQkKfci0M_w5QSOCvrlNgrZR_vGGLiQ8g.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-plbHYSLaHhftVfFpaaKkXmR3gYVJAq3wTRulJ7y4gDI.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-164p_O0RMRUMTHttrNe6Gwuc9ipLXze8Pf-55psCvk0.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-OrQMZlV6Ls9Hlc8qqn_EUxXwM39cJnSn_2-nJQepWAM.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-te00aB4eh4mnHF7SrtUOnMKah2UiBPU-qt00Gr67dOk.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-p1mSH7_xPw2-z3WUFwZ8l1jaL0DwC-B6bk-70IME2QQ.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-a0v9CcnaJEBhV0_kTnvDcLRD5_1mMT7Py47HKBcgJt0.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-ET9a1p4PsHnSjzTXxpmo9kp2MNvhyA0FkwxvatyjcBI.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-iQZtoG8MKgyMaMzojWQyrHZnQOBeo9-uYg8Jwr5rZls.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-efjOBB5KXt0yLQskcpGEtrBpyVzvGhCQFKTtujKYtu0.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-JkdevfNtGFEdTzz1iElnOcSrTGeQZPMueXBnjODJNeA.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-bv9b9tpyE_kdTAFYgWD7VTvhRQu2SRLwj3fbDXTfZfA.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-6AtoVAg_hcKiAsEzpPkFGxRauPnZDHq840vcDls5u-g.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-bPkMJHugTby7ITxgweu5ZjDb84q08oj807vkhUETBpY.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4323459607847385498.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-q99BCWzH8Ht3GmgZr8F0PtQp3O6_yeFnqVUCIV4xJ1s.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-Vy_F_Gr72i0_YixTUvBQAy0tSm1pbEremTSVoDyD92Y.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-RLz6w2-LmjOA4PhNExilnOP0YbNQuvYQyn4Ay2bsk3I.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-O7_ZjZvdC1R7_2v091DUD5QMDQc9Uo6llX8EEdvtEBA.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-X_TesL6BqiopLFsSSc-R93MC6wqtTdKL1nyqZYHcpHc.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-qfzN5m-ksHxXs4K91vE9lij6iw7B5FbP9mmFvUWAvYI.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-vLi-rxp6x-bAwcqHCxKj-Irc2Q2HTNbGdFV56RWxaiY.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tazjk5_QIBXwXW0GcmcVrn8C07SaMmXOR3ls17_lQFo.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-S4f5meODgkHp8oahvaO7MqH6dQRItHgNMBoKX4NW4h0.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-iEcHTWMmLGYJ6SXVqMW9NZqORTH9aYDixqJJ26dSBJA.jar
    Oct 04, 2020 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-K2GPGoKe3GY07lkh_UZKaU3JbLpi4b6lrKnOJaRxMFs.jar
    Oct 04, 2020 12:46:09 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 1 seconds
    Oct 04, 2020 12:46:09 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 04, 2020 12:46:09 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 04, 2020 12:46:09 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 04, 2020 12:46:09 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 04, 2020 12:46:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 04, 2020 12:46:09 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95918 bytes, hash f708518dbe39a5cb1e4786c6f93a4835d375316b1612d0eabba4417d37ca71e1> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-9whRjb45pcseR4bG-TpINdN1MWsWEtDqu6RBfTfKceE.pb
    Oct 04, 2020 12:46:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 04, 2020 12:46:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-03_17_46_09-14998041978943373440?project=apache-beam-testing
    Oct 04, 2020 12:46:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-03_17_46_09-14998041978943373440
    Oct 04, 2020 12:46:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-03_17_46_09-14998041978943373440
    Oct 04, 2020 12:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-04T00:46:09.860Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 04, 2020 12:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T00:46:17.922Z: Worker configuration: n1-standard-1 in us-central1-a.
    Oct 04, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T00:46:18.786Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 04, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T00:46:18.827Z: Expanding GroupByKey operations into optimizable parts.
    Oct 04, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T00:46:18.855Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 04, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T00:46:18.935Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 04, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T00:46:18.975Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 04, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T00:46:19.020Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 04, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T00:46:19.055Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 04, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T00:46:19.626Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 04, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T00:46:19.722Z: Starting 5 workers in us-central1-a...
    Oct 04, 2020 12:46:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T00:46:41.223Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 04, 2020 12:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T00:46:55.215Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Oct 04, 2020 12:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T00:46:55.248Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Oct 04, 2020 12:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T00:47:00.485Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 04, 2020 12:47:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T00:47:16.974Z: Workers have started successfully.
    Oct 04, 2020 12:47:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T00:47:17.000Z: Workers have started successfully.
    Oct 04, 2020 12:47:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T00:47:53.074Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 04, 2020 12:47:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T00:47:53.227Z: Cleaning up.
    Oct 04, 2020 12:47:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T00:47:53.326Z: Stopping worker pool...
    Oct 04, 2020 12:48:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T00:48:37.040Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 04, 2020 12:48:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-04T00:48:37.149Z: Worker pool stopped.
    Oct 04, 2020 12:48:42 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-03_17_46_09-14998041978943373440 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 597681bb-5eb6-4be1-961e-8b7b451bc355 and timestamp: 2020-10-04T00:48:42.315000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.053

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 04, 2020 12:48:42 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.035 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 46.583 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 38s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/klt3dljhmirqe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1071

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1071/display/redirect?page=changes>

Changes:

[noreply] [BEAM-7505] SideInput Python Load tests job (#11856)


------------------------------------------
[...truncated 273.15 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 03, 2020 6:45:26 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 03, 2020 6:45:26 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 03, 2020 6:45:26 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 03, 2020 6:45:26 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 03, 2020 6:45:26 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 03, 2020 6:45:26 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 03, 2020 6:45:27 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 03, 2020 6:45:27 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 03, 2020 6:45:27 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 03, 2020 6:45:27 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 03, 2020 6:45:27 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 03, 2020 6:45:27 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 03, 2020 6:45:27 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 03, 2020 6:45:27 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 03, 2020 6:45:27 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 03, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-Oc_b2oqpOPSFLEHahmYLNMRhDIKypu-Y3JO8qHbthnw.jar
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-RuOveb-_8BNk4h-oME7vZUZrD7YFyM0Bu2K6jJzSRjU.jar
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-Y3bx5Tx4EngdlcGekkp4iyLA_95M6KerFdoGuUjkYPw.jar
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-aLlTOtTx7V1ap-Tz3N4-RuEQi_zZKK8-uE46Ix2A8GA.jar
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-NWKz17McfeW2rq7eP2ZxdOHQBz88RM7syUlXPL590AY.jar
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-Q3AloSD2vVMMfvXbJ6q-Hy0MvlKL9UZpK38M2tWcFcI.jar
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-x6-6O6HnVgxWYTCm07Y5iShFX-E1WokPzwXlmtaEyx4.jar
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-9sP8M_dQDrv3KoQRYIeSkyEVGrYL2SYu0YrC2Fy4QKU.jar
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-coVV-kxnbq7ZLAsLLLrw31_puxM5NA9rf0TFyrY0Al4.jar
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-H3MbfM2tkRtSE8JJfS4sGqc7rFkIG2JgaZuW0_q9Dbc.jar
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-C5PCdfC1QcdUorATJc68EU7A4YuoqZ_c6RucCQw8rHA.jar
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-Syw6XYlsAvgr8f6XdkWPUj1O6kq5QNOS_Fd4lxpcD68.jar
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-27AU8N0eSvLnfLuT6qJ-GyaQLnrk9D2u5UouTfEHOQg.jar
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-PAq1zCwNCKgkUOtb8ffOH3TdAD_xufTE6ATWAeWMTh0.jar
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-lmmtsZx3JvJWegMTJtD9Eq1Um1DXuK35uqnZritI7iw.jar
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-35pGH5WDI1cCH9wSqjFeAm8jqzoI1nkPWg8EO8fJ9No.jar
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-TZtRFG5NJIUK-S_A9eFSlpGfKac9Obvczg2rGimvg0Y.jar
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-cHEEfRukjCELoG6klKkMamke0HqLgeXJQBiLyIWUY2Q.jar
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-vIKNGu7AP28Io-HLuIN-svv1CojYx9cPIUMVinrGTUc.jar
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7599833383359828160.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-3MkQr-of-yChrtue1r7UQJ1rbss_s2c4glcCoFON0H8.jar
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-1Fsl7lVLHmkLJ8A1XPpV-OIwNjuV8JmGto6Ak97vzKI.jar
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-jLcpIk7lW69oGGOOQbTEWELpWD6Cnjlq0Atpo0Iq__A.jar
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-Oc_b2oqpOPSFLEHahmYLNMRhDIKypu-Y3JO8qHbthnw.jar
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-WSfq9pGP7SPEYqTFib1cEK_xHjCiRyCK9hTWg6iRNWI.jar
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT--tf_hbuNqVio3uIRbs_P3pnkD4_bjx25E6vPhBc9UjY.jar
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-o3rcn6vUgIY8E_LVQH5A-kMgEmaHpl6Ohdd6VFpWBD4.jar
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-_rHbIwbfFsr4_zQnsZ0QEHnVTG8W5ImYr798xqH8hqM.jar
    Oct 03, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-Eq0WXmslpeZJStM-WJobusXFevPm0Jv2L03hQlLs_-E.jar
    Oct 03, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-UMi6OmNP9Jha22U5JhUU_GRL-qPwjh4-6ia_Hst2kBc.jar
    Oct 03, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-gJkBn0wVzilQ_Qd5GYGc0fydA7ZBnneb54F409Kqv4U.jar
    Oct 03, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-vjF3SAA1sbt7Zqm26scG5hJncmt2xo9ik6TU_8aRm3E.jar
    Oct 03, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 1 seconds
    Oct 03, 2020 6:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 03, 2020 6:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 03, 2020 6:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 03, 2020 6:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 03, 2020 6:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 03, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95918 bytes, hash 06bad99384705e22760ec175dfeab288247e36e5b406778d4a57fb6afb9c5aae> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-BrrZk4RwXiJ2DsF13-qyiCR-NuW0BneNSlf7avucWq4.pb
    Oct 03, 2020 6:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 03, 2020 6:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-03_11_45_32-5093793783836801976?project=apache-beam-testing
    Oct 03, 2020 6:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-03_11_45_32-5093793783836801976
    Oct 03, 2020 6:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-03_11_45_32-5093793783836801976
    Oct 03, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-03T18:45:32.425Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 03, 2020 6:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T18:45:40.159Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 03, 2020 6:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T18:45:40.984Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 03, 2020 6:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T18:45:41.042Z: Expanding GroupByKey operations into optimizable parts.
    Oct 03, 2020 6:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T18:45:41.078Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 03, 2020 6:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T18:45:41.145Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 03, 2020 6:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T18:45:41.173Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 03, 2020 6:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T18:45:41.200Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 03, 2020 6:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T18:45:41.230Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 03, 2020 6:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T18:45:41.954Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 03, 2020 6:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T18:45:42.030Z: Starting 5 workers in us-central1-c...
    Oct 03, 2020 6:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T18:45:48.563Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 03, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T18:46:10.757Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 03, 2020 6:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T18:46:29.364Z: Workers have started successfully.
    Oct 03, 2020 6:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T18:46:29.456Z: Workers have started successfully.
    Oct 03, 2020 6:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T18:47:04.642Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 03, 2020 6:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T18:47:04.818Z: Cleaning up.
    Oct 03, 2020 6:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T18:47:04.909Z: Stopping worker pool...
    Oct 03, 2020 6:47:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T18:47:57.236Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 03, 2020 6:47:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T18:47:57.288Z: Worker pool stopped.
    Oct 03, 2020 6:48:03 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-03_11_45_32-5093793783836801976 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 679bbb35-ec0f-49e7-a41d-25b01eee610e and timestamp: 2020-10-03T18:48:03.216000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.222

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 03, 2020 6:48:03 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
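
The warning above only means this run's metrics stayed local: the publisher in
org.apache.beam.sdk.testutils.publishing skips InfluxDB when no database and
measurement are configured for the test. A rough sketch of supplying them, assuming
an InfluxDBSettings builder with withHost/withDatabase/withMeasurement methods (the
method names, host, and values below are assumptions, not this job's actual
configuration):

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    // Assumed API and placeholder values -- not taken from this Jenkins job.
    InfluxDBSettings settings =
        InfluxDBSettings.builder()
            .withHost("http://localhost:8086")
            .withDatabase("beam_test_metrics")
            .withMeasurement("sql_bqio_read_java_batch")
            .get();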

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 45.904 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 45s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/anvardk24e5zm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1070

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1070/display/redirect>

Changes:


------------------------------------------
[...truncated 271.46 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 03, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 03, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 03, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 03, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 03, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 03, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 03, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
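
The readUsingDefaultMethod failure above is the generic "no coder for a Beam Row"
error: the Row PCollection produced by ParDo(RowMonitor) never got a schema, so no
RowCoder could be inferred when the pipeline was finalized. A minimal sketch of the
two fixes the message itself lists, with a hypothetical helper name and an
illustrative field list taken from the query above (this is not the IT's actual code):

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Hedged sketch: give a Row PCollection a schema so a coder can be inferred.
    static PCollection<Row> withHackerNewsSchema(PCollection<Row> rows) {
      Schema schema =
          Schema.builder()
              .addNullableField("author", Schema.FieldType.STRING)
              .addNullableField("type", Schema.FieldType.STRING)
              .addNullableField("title", Schema.FieldType.STRING)
              .addNullableField("score", Schema.FieldType.INT64)
              .build();
      // Preferred fix: attach the schema; the SDK then infers a RowCoder.
      return rows.setRowSchema(schema);
      // Or, equivalently: rows.setCoder(RowCoder.of(schema));
    }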

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 03, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 03, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 03, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 03, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 03, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 03, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 03, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 03, 2020 12:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
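
For context, the plan above shows both a project push-down (usedFields=[by, type,
title, score]) and a filter push-down into the BigQuery read; per the log, that
happens here because the HACKER_NEWS table uses the BigQuery table provider with
method DIRECT_READ. A rough sketch of the same projection and filter issued through
SqlTransform against an already-schema'd PCollection<Row> (names such as
storiesAndJobs and hackerNewsRows are placeholders, and this is not how the IT
registers its table):

    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // PCOLLECTION is the default table name SqlTransform gives its single input.
    static PCollection<Row> storiesAndJobs(PCollection<Row> hackerNewsRows) {
      return hackerNewsRows.apply(
          SqlTransform.query(
              "SELECT `by` AS author, `type`, title, score "
                  + "FROM PCOLLECTION "
                  + "WHERE (`type` = 'story' OR `type` = 'job') AND score > 2"));
    }
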
    Oct 03, 2020 12:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-NsLMcqWAM8FhDBOfMIQZUycimLwvXsnrPclTzNmIO5E.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-ic0CUI9W5M3Dwl309iMEkLTKQm2fu6-awnlS9HRUmWc.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2923330005902339042.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-_K_caHnb7zklqVjQS5M89a3bGefepLYFW72-X-PUZRQ.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-SpWITSXKCSkeaxE1L9EykNLwIEMlUx9RgYSVhuRK3Wc.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-EU2VPcKVX4Wk_7ZfejK89QAMYHyke5TsUXlxbAYtHk8.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-eUoC8aPf18hNrw3zEuLB9h-LEEgZRkclyOc0utA8td4.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-Fog96E9trKo6LJEFWC4M19oMpYK48mB8TWcl1_M9OdQ.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-g7R99yjRZTKShXs0vG8YnhcKywLkOjYyVEo2-2wjtFM.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-AOWmxMlOlz6c-fXH4_M4m0Wh0mzJXqEvhd7J8v9syPU.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-isB02RmpdtKuYNYaLDUTEWFcCxboWRSzb3VJT1UE5dc.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-AFL-y0mu_FsGxhZyIy3f1ByD0FzdZVUJKNmFVdoonks.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-A-n_RtIdum6lKRlKs2XSO1n6lvvPnNjcHjMN90f_ACo.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-jzLq8gMaQX14UMIw3FUhdviRfQUKtUIaRgE6J443ZGU.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-Sa4AKunak45NNhU8O9EA88rAZ7QUK_iWqVZYhdJ8iKM.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-OEmlrLnj7Q1odypd12Wn9X0y3cY-eJ8o9W4XyML0gsQ.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-HALwoYu12UUeNIshfos7P3MZzKi3Ylf-OJwyHFtgwnA.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-yXwyZTJINiRM6KDOqCLi-68oeC7eJ3p4qhzXkcwcZOM.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-Ncy1x_wP8sT-nqAfQ5ETgI43ppQIndZxYv0OMxaKGuE.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-9lwGfwfOqdXONSTx--J0Izh16CdbJJbtyxeqgyzeOOM.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-587zoXd2Lblb1j-zRmDIsN9RVIq727-t9ZkDamWJO5Y.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-wZnTWrf-w0-lt5jhWxOK0Hj1XSoeFZP-KdfYP9Pe9H0.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-ZW7eNjA_nr4zjMGGGlWkjCGPpnYZT5DDpm2Kyu0wCpc.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-cgP3iD7MYK8QcA0N4BsZkAbMrSx6AUQtj6pUjjNA_aw.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-NsLMcqWAM8FhDBOfMIQZUycimLwvXsnrPclTzNmIO5E.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-pJN08dgTOMFyV2ONHRPUgxugOtaxAD1MvnENBzonMv8.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-eGGOxaVOPR_h_FnxjiSZSu8tO6krlMVomAX89YgUxuU.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-rG0U9Ry5OJaCy9u026Z_UueaU-Aildsz1ABqGO6nmMI.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-xfwVLcBTVGH8zUpZBC5sOgd4NEPnaKo0X1w0v7Q45Cs.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-w0s30PXVrEeYl15XKK74SOc3NMlSvN6H_7oMUNtnfKQ.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-pd8GCYYUzsx4Yj9Fc51kOoF5OAK14fWtN3tvm-JvoDg.jar
    Oct 03, 2020 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-uYRFOyKAl5KG7v-tJe8pDQUPu1FfF0cFvVPsn9qwDZ8.jar
    Oct 03, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 0 seconds
    Oct 03, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 03, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 03, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 03, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 03, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 03, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95918 bytes, hash 0c675b56f174b22f145621d0d4d3e40a4fff223220c67553f595a6dcd125ad3a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-DGdbVvF0si8UViHQ1NPkCk__IjIgxnVT9ZWm3NElrTo.pb
    Oct 03, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 03, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-03_05_45_15-13655537526396020451?project=apache-beam-testing
    Oct 03, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-03_05_45_15-13655537526396020451
    Oct 03, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-03_05_45_15-13655537526396020451
    Oct 03, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-03T12:45:15.964Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 03, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T12:45:24.425Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 03, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T12:45:25.159Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 03, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T12:45:25.197Z: Expanding GroupByKey operations into optimizable parts.
    Oct 03, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T12:45:25.224Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 03, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T12:45:25.298Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 03, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T12:45:25.326Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 03, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T12:45:25.359Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 03, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T12:45:25.387Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 03, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T12:45:25.837Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 03, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T12:45:25.916Z: Starting 5 workers in us-central1-c...
    Oct 03, 2020 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T12:45:49.634Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 03, 2020 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T12:45:55.945Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 03, 2020 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T12:46:08.377Z: Workers have started successfully.
    Oct 03, 2020 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T12:46:08.410Z: Workers have started successfully.
    Oct 03, 2020 12:46:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T12:46:45.599Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 03, 2020 12:46:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T12:46:45.828Z: Cleaning up.
    Oct 03, 2020 12:46:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T12:46:45.897Z: Stopping worker pool...
    Oct 03, 2020 12:47:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T12:47:32.316Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 03, 2020 12:47:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T12:47:32.454Z: Worker pool stopped.
    Oct 03, 2020 12:47:38 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-03_05_45_15-13655537526396020451 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 3e71b6c2-b954-4a02-890e-4a91926306ca and timestamp: 2020-10-03T12:47:38.110000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    19.415

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 03, 2020 12:47:38 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 35.372 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 23s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/j5mfkcopnchng

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1069

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1069/display/redirect>

Changes:


------------------------------------------
[...truncated 272.34 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 03, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 03, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 03, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 03, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 03, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 03, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 03, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 03, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 03, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 03, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 03, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 03, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 03, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 03, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 03, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 03, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-k3ryegkdiYVwwoX8c9W44-0WH7W8B7NIP5UsvzNjofo.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-q08Ku77Sh_Ix7VOJ9a-0rNpX167hwZRFUDoMyvO_RXU.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-03ntw2w_O8cQXHgZPlCqzlwa2V1ntq03ImAKPdpjHWA.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-BVp3R-0lk_NkA9q_-l9CfU5WGvgS9Mzp6GoebVszKr0.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-eZHvPVl_8ZPG3A3VUAcUGaSbX849nIupKVhBKOn_jGs.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-iV0NABNybU9fRCJN1Fal4IzzUVwfA0JyXKYgkhQkiXg.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-sm8PCd7xyh2gDTNZ7_tW017BT23b-UwjelKrXI3VDeQ.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-xOAMqOvDWF8FVC8Y8KBqrD91ejw4h8KAtCIfmoJ9p4w.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-qjlXY3b58p_xUvXitR70h-7NehEZ64k_V5fKahYPnok.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-u34qUoXoTxOG1xoeHSEaahiQofZWHTJfW07FmO3gPaM.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-RVH1TbVa0SdQ9Fcqd4Q7XiofeKIhg2CQnbjZYZDSbdU.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-G7CvKCsH7yqKKGU0NOIZmhBC8d1LwqIESk5XlPI8VK4.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-IMg32JyH_ZqaUwavtiWyM2tkzady1EsIwPnOrx1NXUw.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-x0FZFQtG4WGP5N5eZ4PrjtkzOtOsfWjAYr40m15Iw5Q.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-BYYTcujW1tLy80Y9syeTbwyYsviyKIYJ9r-37TnwoQA.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-k3ryegkdiYVwwoX8c9W44-0WH7W8B7NIP5UsvzNjofo.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-zR9x1cIqUy4qpewNtpOtQ21gGJEMvz7xgWluhNkZkzE.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-VS9BSxKahSkgO1nEe8UfLCyYV5aT-DDmsjA9GSG8CgE.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-l9KIu5t4wxcP02QIWbmap_OU4HCD-aPR0mtTfmr4OTs.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-cMUqXcetrqe88PVxCn0SVNkhNYmGxV0qp7hiWHFWDQk.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT--sFzWZpp5HDJqVjM1tZiuT6dL4XmMH4RZiyck6UC9cg.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-xtwDObvEtQXH9f25ooki6e-RfMPKZ2JiEj8lCE6ywxE.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-UegKnRPaFjqt-AmrJh6n892MKRzu4ngEjSKcFyVJr-E.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6952840879910391612.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-_UOCVLWIY8z-6TS4WNQM7RuqWMhOw916q4jYm60dyC4.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-iDB9fVVk6VkxpQ6tprSPCiFYnFmAn7bRxkVfCCXhxxc.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-G2wpRqgINvYGDwFu7cA1XnZgEjfpLLslNvSj6dkIDp0.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-VZZMtpV0tFINEKsLo7BCrkPbwxFV6aMuVLvK0BN9REs.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-WvSnUCqRafreHgVfCKJei3k_R_J8Q2g7h7fE7PvETlI.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-uESBY8LpoCn9WG_F_YLwYR7YR-Gwj5dzVfdZ07NFIvw.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-LsKJYQZ-F9NEo2tSd8TkT4f16mXFRK2qMFlpEJ2x0d8.jar
    Oct 03, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-cidwls1ChSxoRpXbx2pHTzlU33tIwsH6CXUIeYfaYAo.jar
    Oct 03, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 1 seconds
    Oct 03, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 03, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 03, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 03, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 03, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 03, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95920 bytes, hash afa8262c35884484a569f41bdbab6e1ab88c1ed74b6808334cc45575be083756> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-r6gmLDWIRISlafQb26tuGriMHtdLaAgzTMRVdb4IN1Y.pb
    Oct 03, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-02_23_45_18-5771087400379037969?project=apache-beam-testing
    Oct 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-02_23_45_18-5771087400379037969
    Oct 03, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-02_23_45_18-5771087400379037969
    Oct 03, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-03T06:45:18.696Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 03, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T06:45:26.559Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 03, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T06:45:28.311Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 03, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T06:45:28.352Z: Expanding GroupByKey operations into optimizable parts.
    Oct 03, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T06:45:28.397Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 03, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T06:45:28.460Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 03, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T06:45:28.499Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 03, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T06:45:28.522Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 03, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T06:45:28.550Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 03, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T06:45:28.987Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 03, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T06:45:29.071Z: Starting 5 workers in us-central1-f...
    Oct 03, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T06:45:53.960Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Oct 03, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T06:45:54.000Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Oct 03, 2020 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T06:45:58.853Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 03, 2020 6:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T06:46:15.919Z: Workers have started successfully.
    Oct 03, 2020 6:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T06:46:15.951Z: Workers have started successfully.
    Oct 03, 2020 6:46:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T06:46:19.921Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 03, 2020 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T06:46:51.189Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 03, 2020 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T06:46:51.345Z: Cleaning up.
    Oct 03, 2020 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T06:46:51.434Z: Stopping worker pool...
    Oct 03, 2020 6:47:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T06:47:42.314Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 03, 2020 6:47:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T06:47:42.349Z: Worker pool stopped.
    Oct 03, 2020 6:47:47 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-02_23_45_18-5771087400379037969 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 06877a97-eecb-4a1d-8e26-9da1f3eafa41 and timestamp: 2020-10-03T06:47:47.758000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.876

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 03, 2020 6:47:48 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 43.094 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 31s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/z3hlajblbqmm4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1068

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1068/display/redirect?page=changes>

Changes:

[samuelw] [BEAM-9399] Change DataflowWorkerLoggingHandler to report errors to the

[Kamil Wasilewski] [BEAM-9506] Evaluate gcs_location at runtime, not at pipeline

[Luke Cwik] [BEAM-10670] Fix impulse to use MIN_TIMESTAMP and not current time.

[noreply] [BEAM-9547] Implement dot method for dataframes. (#12990)


------------------------------------------
[...truncated 275.08 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 03, 2020 12:45:44 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 03, 2020 12:45:44 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 03, 2020 12:45:44 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 03, 2020 12:45:44 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 03, 2020 12:45:44 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 03, 2020 12:45:44 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 03, 2020 12:45:44 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
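
The exception text above points at the fix itself: attach a schema to the Row PCollection so a RowCoder can be inferred. A minimal sketch of that call (not taken from the test source; field names and types are assumed from the projected columns of the query, and the real HACKER_NEWS schema may differ) would look roughly like:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaSketch {
      // Attach an explicit row schema so the CoderRegistry can supply a RowCoder
      // instead of failing with "Cannot provide a coder for a Beam Row".
      static PCollection<Row> attachSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addStringField("author")   // field names/types assumed for illustration
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();
        return rows.setRowSchema(schema);
      }
    }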

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 03, 2020 12:45:44 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 03, 2020 12:45:44 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 03, 2020 12:45:45 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 03, 2020 12:45:45 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 03, 2020 12:45:45 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 03, 2020 12:45:45 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 03, 2020 12:45:45 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 03, 2020 12:45:45 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
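
For context, the pushed-down read shown in the plan above corresponds, at the BigQueryIO level, to requesting only the used fields and applying the filter as a row restriction on the Storage Read API. A rough sketch under assumed values (the table reference and exact options below are illustrative assumptions, not the test's actual configuration):

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    class PushDownSketch {
      // Reads only the four projected columns and pushes the WHERE clause down to
      // the BigQuery Storage Read API; the table spec here is an assumption.
      static PCollection<TableRow> read(Pipeline p) {
        return p.apply(
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full")
                .withMethod(Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
      }
    }
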
    Oct 03, 2020 12:45:46 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-kd8WIVN3kjtiyv6EzB06ymqIcypz7rtcZmKAg3VjSfc.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-LaCe2SyokZ8DqPO38ir1PMEPH5Ijch_ntuSZ87KYirI.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-P5VKbvo-E-NrCczSsiaxmSug8xlrZX1AeyIPf-5_1qk.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-lXiaXmX8HUVzsI1gTSxJDJsY2F2bjoGjp2BjkQ-r-xA.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-pu9aKeevMEdl9UgC04Eb1cVdWaARXzKdNV06HCvrpdA.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-7XCNoaPEbopuS0QYYL1k3pLR7hoYbBhLzcPQfONcZvQ.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-HTVZnbGKRiRmIYjJiRTi_6Onw__BViqfjCd0SIHkmnk.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-kd8WIVN3kjtiyv6EzB06ymqIcypz7rtcZmKAg3VjSfc.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-gI4TuEmIj2wu9m4QDSKyIHjD9c5xfVnD-pJAy4LEOnc.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-EUYqwWBEk2e6JFNj6O0-kfpNgwjIM5XHtM9ebUFICKI.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-kQPdskmlhGu_Xj5nra6EiQloTB9a_M_YHbHlAriL5bo.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-YMwmLMNE9lZoiOg0nlAEztrtBZuM5uaaNY-yKqMc5HQ.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-2fmJim1EdawKo8fQW14g3vFe_bFIpLYv_D3XyBGoRMI.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-pwu3DCk4CsUjiU7p1uONYYj5edAG_a6uz2a_r1AHWyY.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-DcdL5OnRb7Dg0fMPsNokuxHgoTAe3F9APC7eREtyx1c.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-6IlhFnM_iZf7kMJi2CO9SHgSfAFRfaxwb4vOwUQlxiY.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test991930488145804039.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-DNNe4YVXhCg1gchw3RDdhUHjEsFNPWU-x5eHHHpOaXY.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-2nxNuJrl-LtHkoxOT0Ai9Iti8eZJjjbdl29TNd7CITc.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-yYu6UUUiGsHlEEnvE2yea9EUZXmZ_LAu0iyS1pOdWEQ.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-bH5sdQA9xDPmieZzF7kLtQFSKQHac5KrDJTsMw4gI04.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-9FbWbEsz5uZ7D3kDcFSTUICxqtOA_uCUG0665lxnNEc.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-_n9vpUdGyJM20xlcgw1ZX6z3-a0B6VANan1S6g8MARI.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-vlggSUAE7NuRLI74VmpBwf0hg9PCwfCz_xydQ65DVxs.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-Y-kvM4K4Gxuz2zoB6o--Htv2GCE0J7fUqjMlLzTIpyk.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-wKKmJ-MFJih8Lg-Wg1Sbm4xehkrIcmhCR_IEbp5JzCs.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-4IM_HFP99mtqd7LRzOcFLevVEaECTL5NZ_jpMYxJ68Q.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-BhJUOuBmFqwYWOfh9msaoHNFhlYtzIMxr0rQag0h9DU.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-UmSnzH31wiqWMy237SIuAki_qmolDvLj_vLTJ88yrmA.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-DBMs5oA4egqwfjmN-s5tWjyq1sGYNSmvsnBleka8XMw.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-7o-dxAdCwCxI_UeMqSwJM8VsKphqw_jJVH3pwp4_WDg.jar
    Oct 03, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-c8gJI_PjPcKG_gzrZZoHqe_M9AMm9J_4bURzsQW0UNw.jar
    Oct 03, 2020 12:45:49 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 1 seconds
    Oct 03, 2020 12:45:49 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 03, 2020 12:45:49 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 03, 2020 12:45:49 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 03, 2020 12:45:49 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 03, 2020 12:45:49 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 03, 2020 12:45:49 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95917 bytes, hash 033e164f8e44933c69e72a966dcf2ab418449bb4e951738edde3135c4ae08c7d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Az4WT45Ekzxp5yqWbc8qtBhEm7TpUXOO3eMTXErgjH0.pb
    Oct 03, 2020 12:45:50 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 03, 2020 12:45:51 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-02_17_45_50-9300022534963542264?project=apache-beam-testing
    Oct 03, 2020 12:45:51 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-02_17_45_50-9300022534963542264
    Oct 03, 2020 12:45:51 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-02_17_45_50-9300022534963542264
    Oct 03, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-03T00:45:50.149Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 03, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T00:45:56.985Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 03, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T00:45:57.595Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 03, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T00:45:57.705Z: Expanding GroupByKey operations into optimizable parts.
    Oct 03, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T00:45:57.735Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 03, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T00:45:57.809Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 03, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T00:45:57.847Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 03, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T00:45:57.891Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 03, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T00:45:57.912Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 03, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T00:45:58.514Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 03, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T00:45:58.607Z: Starting 5 workers in us-central1-c...
    Oct 03, 2020 12:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T00:46:24.503Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 03, 2020 12:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T00:46:30.664Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 03, 2020 12:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T00:46:50.795Z: Workers have started successfully.
    Oct 03, 2020 12:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T00:46:50.831Z: Workers have started successfully.
    Oct 03, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T00:47:21.093Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 03, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T00:47:21.226Z: Cleaning up.
    Oct 03, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T00:47:21.298Z: Stopping worker pool...
    Oct 03, 2020 12:48:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T00:48:11.393Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 03, 2020 12:48:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-03T00:48:11.427Z: Worker pool stopped.
    Oct 03, 2020 12:48:17 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-02_17_45_50-9300022534963542264 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 4fafed2a-7241-41da-9dea-f8add647b744 and timestamp: 2020-10-03T00:48:17.908000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.617

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 03, 2020 12:48:18 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 42.508 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 0s
107 actionable tasks: 69 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/e7ivpsqfzxboa

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1067

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1067/display/redirect?page=changes>

Changes:

[dev.vasugupta] Moved getTableSchema method from expand to setup in ClickHouseIO

[dev.vasugupta] Added a Nullable TableSchema field in Clickhouse.Write

[Kenneth Knowles] [BEAM-10991] Adjust CHANGES.md to indicate rollback of fix for BEAM-8543

[Chad Dombrova] [BEAM-7746] Add more typing to metrics


------------------------------------------
[...truncated 272.10 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 02, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 02, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 02, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 02, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 02, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 02, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 02, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 02, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 02, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2020 6:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 02, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 02, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 02, 2020 6:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 02, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 02, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-GJ2aioZzU9QI0k8l2RFj0H_QE5-m3obrDORnhbXfbG0.jar
    Oct 02, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-XCf7qNpu8Yaup2IjQnnBFWsJOwDA_PiMYDLdu5D5XEA.jar
    Oct 02, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-HFi14SguN4CcQd91L8XQhENyOExrMIoZka-PcRUiopU.jar
    Oct 02, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-NnZBc0x6D_1ZpGbijPeiCayyNU4AyiaTHrP932mEvic.jar
    Oct 02, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-vgHX3R5vQ8G6dU4KesH1OJDkgoLn1YNn9ikEz-Gy5Hw.jar
    Oct 02, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-uo-veNWWxeq9JDGDSYSJIpuiVB6M6hYc5XpZGWZTj5w.jar
    Oct 02, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-W0rrH3LHQ6mk7ZmiE6q-vO7ZGlCNYWUxjvPYXoMRjic.jar
    Oct 02, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-xZ5v9QloegOPvnIfjxLwLjYOGIEM0MPmOVPeYDx6TbU.jar
    Oct 02, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-DJpQGzfnp7VJTurIlGRtv6q-j44792xDkh7uWcOFWj4.jar
    Oct 02, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-aiukFHrezU-dBWNhapQkbyOI5wbmM9j4-wteRDIbzxI.jar
    Oct 02, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-JjRpKtQ_vo-gJOg_UddSw1QoCep6p7flRp4Mq5gI4Ic.jar
    Oct 02, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-LgWmWor24AKvHfs5Z3svCsBFoMUb-3qYo2o_C-jPkfE.jar
    Oct 02, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-uv1RQ1kuHqXX4fOcGLOW4ZsTV1iwR6wSAYDhrDLE-rQ.jar
    Oct 02, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-SwFIHAulHJhg3BmojnBO8jqKPG7ShMbJHnTUC2gooZU.jar
    Oct 02, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-dHfjjSDUIqgIlLUVZ15rwg-vh3cqr1LhBBJQ6A8q3J8.jar
    Oct 02, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-xkFQe_scmistkbjXYNotBGtNR0ILuTDaB9zQc2QV6Ro.jar
    Oct 02, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-nWY2WXD25p206C3ScovUXvrBlyZkFqU9_tzBz9FK5mU.jar
    Oct 02, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test9082496020580768236.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-VRywD_rUPH1kegwGbOrph1JX8mugtTWVVm2m4mq6WbA.jar
    Oct 02, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-DseXzzcmcMZwbLRPuj8D01RSRhuOYL60m0zbh2X6wrg.jar
    Oct 02, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-T-f1yKvwG_3If4WosmHnMza6C-6_Xcko5LT0Ibyths8.jar
    Oct 02, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-63h5HS64wsV6E7ksiriG32bw9CaSO-NRihVttP6Ooqk.jar
    Oct 02, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-aM0pm_HsADiJXQNB3T8P5cBJ93N8a5KjOR1fjQ_fvQg.jar
    Oct 02, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-hy3asmqcFNEoKmlrKU2tx_0EEhVVHeBI14KGswkKLgc.jar
    Oct 02, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-pymohxwlxLotPIyNIAfnWOhOPMJFt5bSbnH55v1xfFM.jar
    Oct 02, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-zdypmvZPqYDzFJ6JxsVefaeBIoI0BlrnaFNyRwgu2qU.jar
    Oct 02, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-G2saRUsOEKWASQA_lEmqFmnt6AtUlUBBmXOARRKlPpc.jar
    Oct 02, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-xwCbAphkE0ccTAJpLFjddfSCvKiTannB3D4TUNcmFt0.jar
    Oct 02, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-GJ2aioZzU9QI0k8l2RFj0H_QE5-m3obrDORnhbXfbG0.jar
    Oct 02, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-ktjqoPEH4k11pK9jvg8pZDdEM6q1xOqvKZLewac9P9s.jar
    Oct 02, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-jx2btBXnrTiqxgbmDaJV4UWCZN4MirBu7-xe3B7ribY.jar
    Oct 02, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-JtNVOpw2WLVjbq39o8MlUk_EokQ0-_HTi8dV8GmJTJs.jar
    Oct 02, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 1 seconds
    Oct 02, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 02, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 02, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 02, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 02, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 02, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95918 bytes, hash 650c8a9e03065550c818ad488c40374fc0d36130a37f5bd8dbfa7ab9d1e1376e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ZQyKngMGVVDIGK1IjEA3T8DTYTCjf1vY2_p6udHhN24.pb
    Oct 02, 2020 6:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 02, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-02_11_45_18-2970834532740667993?project=apache-beam-testing
    Oct 02, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-02_11_45_18-2970834532740667993
    Oct 02, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-02_11_45_18-2970834532740667993
    Oct 02, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-02T18:45:18.617Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 02, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T18:45:27.311Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 02, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T18:45:27.924Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 02, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T18:45:27.971Z: Expanding GroupByKey operations into optimizable parts.
    Oct 02, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T18:45:28.016Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 02, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T18:45:28.104Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 02, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T18:45:28.132Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 02, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T18:45:28.153Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 02, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T18:45:28.196Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 02, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T18:45:28.630Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 02, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T18:45:28.709Z: Starting 5 workers in us-central1-c...
    Oct 02, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T18:45:34.616Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 02, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T18:45:56.118Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 02, 2020 6:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T18:46:14.398Z: Workers have started successfully.
    Oct 02, 2020 6:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T18:46:14.430Z: Workers have started successfully.
    Oct 02, 2020 6:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T18:46:51.329Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 02, 2020 6:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T18:46:51.473Z: Cleaning up.
    Oct 02, 2020 6:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T18:46:51.549Z: Stopping worker pool...
    Oct 02, 2020 6:47:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T18:47:36.331Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 02, 2020 6:47:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T18:47:36.381Z: Worker pool stopped.
    Oct 02, 2020 6:47:42 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-02_11_45_18-2970834532740667993 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): e2b87e54-c897-438b-83fe-1701fb35d189 and timestamp: 2020-10-02T18:47:42.702000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    17.242

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 02, 2020 6:47:43 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 37.967 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
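
For a local reproduction with the extra diagnostics Gradle suggests, the failing task can be re-run from a Beam source checkout, for example:

    ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --stacktrace --scan

(The integration test also needs the GCP credentials and pipeline options that the Jenkins job supplies; those are omitted here.)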

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 27s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/zup7kpomtgjfg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1066

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1066/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10639] Integration test that exercises --setup_file flag on


------------------------------------------
[...truncated 273.81 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 02, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 02, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 02, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 02, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 02, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 02, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 02, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 02, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 02, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 02, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 02, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 02, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 02, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-iGaPWM9LLuvXgb5cfijK-k_wFS3RK-O1bBZDT7bi9iw.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-cWcAU6upjcw2_ofnA7QRzaS9_zn78xzyX_wpG7DWUao.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-bjoyxvLqiTcKpIjL6SyDzuWhiFtFQaZbBGaH7hE7rxU.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-sgWvYoohmy8RibMQFcSa9IFvJi4iqVoxsI1zSUKQYFA.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-bIH2YQMbzfLqfFctUleaij8oJ1eI8P14uhYxPadfzUY.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-RCoCZVG85YquElrxLIXWvC-8MUJWaxaJ6_jgl6u2brI.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-nJUi6WCg7Tp6UVvycnyxl7xsGL4bFbzgXuhIAEK3Ff0.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-A4_DvGAARue22S9YkuVVa9EzB12cv7N22drJzBUYFWo.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-A4_DvGAARue22S9YkuVVa9EzB12cv7N22drJzBUYFWo.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-xVWj3FdS2rSBU83AAyHzIDh39y5-UOdCxDJtXnVaP2g.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-LchLfu9vlKviTmYvQsJX5lQCujNfS8woBd6WSS92OxY.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-cccQ7TFZ_QkBujdkaeCeO7y50Tcx1XaophALgCNItGQ.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-16qlPeROrNysLpCP51DLp4tFwJux_eDtH6KE2oLL7BU.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-2iX1b_V-rl1G3O8yEig-QOE5T2rcRO52YyQPCFSykqU.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-CsC-EpHqDy6_5bNl5qVMkFINCKMe-O8Vnl5-s9tb79k.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-wP8pUatu24WSdH5IpYlYE5mKNO9CoVU8y0UigxdNsoQ.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-19faszcH2L0RpccbF-OT9FxqumkRR3Yx9E3EJv68qR4.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-MHHYXzVUPyJudWd0Ov5m7MiQtA0q3ioQtdLTYkXm_hQ.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-iMhkFZNvYqeifHMlzyeMhS7tAckiXi_eU7D3RIEumbg.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-AVqOEfd4lMRabWHqbNOFOenff2t67XbeHBfylypBiKs.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-gAcwkqP2V1NNfeMXRL87k_k2GgaoBNNlofnSEDil5yE.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-3og-OMNovZwIRBq9RpY-6ylJW6L-WZDxtEosqxSiCSE.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-gwPsdowjUSY12bGdhMmVEAK25W-aXHoJAIxISFAGzPk.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-NPThWnCQIYU1rM3Zs4A0yGfcLJVZvBAz_l1dvpgYQWc.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test758263520880191646.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-lQjo11pngTLpTxgJ6ZzKLDd9XK3sM0y6GZ_NKh3xzsQ.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-4KhxoWSI-OfI1Z8_qsHbUHyMaMr7H5qW-evBZa2fQ44.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-v6lmKWSZ9PmziMK85taQnexOrvJJC1QylzEjHZKGfhk.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-ccloQdkovIW3F3Wwc3YpKKVD3Hdrfs5rKxaK2b_PWvE.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-GicTojMP8GNjoVhBstJsAR-CSQ1xxZ6sqbUOO23yzDk.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-c4PzeqB9JHLL-LJdOZ9yAAqbjAD7pobKBZ6F6nJGtSM.jar
    Oct 02, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-1AVwikicRXlgtN8GroEK8sXh-nhT4303L0kmRObMcFE.jar
    Oct 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 1 seconds
    Oct 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95920 bytes, hash 767ab44d8e5f7a6e8a670dcdc1a3c93f1c29f42905d42a53d314e865ed92dbec> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-dnq0TY5fem6KZw3NwaPJPxwp9CkF1CpT0xToZe2S2-w.pb
    Oct 02, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-02_05_45_18-11045901659230851146?project=apache-beam-testing
    Oct 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-02_05_45_18-11045901659230851146
    Oct 02, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-02_05_45_18-11045901659230851146
    Oct 02, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-02T12:45:18.677Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 02, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T12:45:26.662Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 02, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T12:45:27.405Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 02, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T12:45:27.445Z: Expanding GroupByKey operations into optimizable parts.
    Oct 02, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T12:45:27.484Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 02, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T12:45:27.563Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 02, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T12:45:27.583Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 02, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T12:45:27.618Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 02, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T12:45:27.648Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 02, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T12:45:28.013Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 02, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T12:45:28.113Z: Starting 5 workers in us-central1-c...
    Oct 02, 2020 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T12:45:50.626Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 02, 2020 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T12:45:54.898Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Oct 02, 2020 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T12:45:54.941Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Oct 02, 2020 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T12:46:00.272Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 02, 2020 12:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T12:46:14.574Z: Workers have started successfully.
    Oct 02, 2020 12:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T12:46:14.605Z: Workers have started successfully.
    Oct 02, 2020 12:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T12:46:52.201Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 02, 2020 12:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T12:46:52.341Z: Cleaning up.
    Oct 02, 2020 12:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T12:46:52.424Z: Stopping worker pool...
    Oct 02, 2020 12:47:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T12:47:49.229Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 02, 2020 12:47:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T12:47:49.265Z: Worker pool stopped.
    Oct 02, 2020 12:47:56 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-02_05_45_18-11045901659230851146 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 15f98645-1d5e-4c22-bf8f-fc6dd2403596 and timestamp: 2020-10-02T12:47:56.057000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    17.194

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 02, 2020 12:47:56 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 50.634 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 39s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/dqsemdwepkjuw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1065

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1065/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10862] Handle empty tfrecord files within a glob (#12790)

[noreply] [BEAM-8024] Add dataflow and flink runner to JPMS test (#12970)


------------------------------------------
[...truncated 271.77 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 02, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 02, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 02, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 02, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 02, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
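
For context on the coder failure above: a PCollection of Beam Row elements has no inferable coder until a schema is attached, which is what the "Please provide a schema instead using PCollection.setRowSchema" hint refers to. A minimal, self-contained sketch of that pattern follows; the class, field names and values are illustrative only (loosely modelled on the columns projected by the query in this log), not the IT's actual code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaExample {

      // Stand-in for a ParDo that emits Row elements (as ParDo(RowMonitor) does above).
      static class ToRowFn extends DoFn<String, Row> {
        private final Schema schema;

        ToRowFn(Schema schema) {
          this.schema = schema;
        }

        @ProcessElement
        public void processElement(@Element String title, OutputReceiver<Row> out) {
          out.output(Row.withSchema(schema).addValues("someone", "story", title, 3L).build());
        }
      }

      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Illustrative schema; the IT's real table has more columns.
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

        PCollection<Row> rows = p
            .apply(Create.of("A title"))
            .apply(ParDo.of(new ToRowFn(schema)))
            // Without this call, coder inference fails with the IllegalStateException above.
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }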

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 02, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 02, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 02, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 02, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 02, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 02, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 02, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-rFvUdoIWrjdzLEZJHqPqNYqJ7nEFzoweNxOSZmLtgcQ.jar
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-p_kYo8I6YKfykHIZ-OO4aYbngzqSF5AEjz_kJVmZMHA.jar
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-zhDZkF-YdysDteLsOdCctHUpqmqaoQ_tB9zmQq3nhko.jar
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-YJL1ZoJpFPAa4VLjthuqagi9K3U8z8JSQASIhmagDOw.jar
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-Ylt5GcD_pMtpKK5nlV0vLGjg8z3JxiXHd3pLDlVsHaQ.jar
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-rFvUdoIWrjdzLEZJHqPqNYqJ7nEFzoweNxOSZmLtgcQ.jar
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-STXSs4hXdHVxURkB-kk-Sbj3KgnfJZJfjvi_3liJFtQ.jar
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-loCfLy53QHU6FwY5AssD8TNCUYhsvkOohJZu5WQ_f5c.jar
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-KoM34YFipWXc4AS_4YC5xSDG2WlYW5RJOZxbFkVaJj8.jar
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-Ah8Iq7_13aafciDvbx_izMwA67aOdnlH1CD47vmW4VQ.jar
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-cF_iO3tj9aB-u1H2seqDgfNxE4wHX_d2e8WiRblUr6M.jar
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-4yNVfgZAlnE5jP3P0o8LstOImpF3JQeB_rmG7_Mh61o.jar
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-UqrR7NLrqIBJvaDeyHexXFCPG6wF8maUFWZbxIrgxQ0.jar
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-z1roQcFqgzA3O4zT4sJpqVbnFtc6JvuVYvi2IE_lR3U.jar
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-pL3uJWzb0c_JFsvoZm6sq3RBrNqMnicDifEUHHwcCik.jar
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-2dSXh1yRQq996_dpQqhDd4RuK6jINEWl5uCslvkQPmg.jar
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-riWFH6_-cYTBY-ZO4vrpF1L7rGVPA6EpTNj00rnDHLs.jar
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-KwuwgR2l94rncKCVp7lHHPcQ6CTkvzzJDmADDonrYTM.jar
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-8gxKb3YYvGAoKQv_n2SWIjpEBYNMk52-LAb6ECCG95k.jar
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-n4aagjFSGXU5vhDCde23fYwbehSrmIDSt3etQhpCTCc.jar
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-fiVq30vAHVr_xL0vALVBTnxuIJDY__urgw_LRM-drB0.jar
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-VtQn3buJIaA47KsdzGJo7aAnq7EutFGAvKPi48Z9ZwM.jar
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5863727385268716023.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-SK5A3BSn2NO7xEoCs79DPdNV7-z-gvkkjtVwl-XjCBw.jar
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-oUbGsEFJvQnxvraW-jwfAZGPOZ-jRGoQsXcoHVbkfyc.jar
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-EnrerS-ng3n5C9lHS6gJ4Ra7woEoUP6o-ypLHQxEL1I.jar
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-Js7sX0uLXcITeV_qLyiDVCxshDy9msf4-jEtBd5alpQ.jar
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-k0qdD13TDtj30S_CJHRoV7wfqXa5Z5umP9s3KhSahsM.jar
    Oct 02, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-IYZ21Bwr_BnVmOgfBMAuNIeMGuBnDnJMo6z4PS4Lyf0.jar
    Oct 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-UEYOkVlhR1O615MX6zbIo886Ki5lhpk7qJRI4p4rYfY.jar
    Oct 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-uP6MmPj6Mc5vS1w6C5oF9cSnzd32jfaqYN66Ti4zt64.jar
    Oct 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-IkXrGzyWt9Rh4wzgxEbAVNrnxYdao7b0voVWg1YOBrI.jar
    Oct 02, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 1 seconds
    Oct 02, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 02, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 02, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 02, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 02, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 02, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95919 bytes, hash 96ec645c8d1a9972b0e47f7f261313c1dc785c12f017819f32fd21569001bc34> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-luxkXI0amXKw5H9_JhMTwdx4XBLwF4GfMv0hVpABvDQ.pb
    Oct 02, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 02, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-01_23_45_18-17284623417373366037?project=apache-beam-testing
    Oct 02, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-01_23_45_18-17284623417373366037
    Oct 02, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-01_23_45_18-17284623417373366037
    Oct 02, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-02T06:45:18.604Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 02, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T06:45:26.904Z: Worker configuration: n1-standard-1 in us-central1-f.
    Oct 02, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T06:45:28.582Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 02, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T06:45:28.620Z: Expanding GroupByKey operations into optimizable parts.
    Oct 02, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T06:45:28.648Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 02, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T06:45:28.730Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 02, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T06:45:28.756Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 02, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T06:45:28.790Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 02, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T06:45:28.815Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 02, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T06:45:29.265Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 02, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T06:45:29.335Z: Starting 5 workers in us-central1-f...
    Oct 02, 2020 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T06:45:52.922Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Oct 02, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T06:45:52.953Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Oct 02, 2020 6:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T06:45:58.282Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 02, 2020 6:46:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T06:46:01.919Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 02, 2020 6:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T06:46:10.893Z: Workers have started successfully.
    Oct 02, 2020 6:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T06:46:11.000Z: Workers have started successfully.
    Oct 02, 2020 6:46:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T06:46:43.077Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 02, 2020 6:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T06:46:43.221Z: Cleaning up.
    Oct 02, 2020 6:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T06:46:43.316Z: Stopping worker pool...
    Oct 02, 2020 6:47:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T06:47:34.784Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 02, 2020 6:47:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T06:47:34.821Z: Worker pool stopped.
    Oct 02, 2020 6:47:40 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-01_23_45_18-17284623417373366037 finished with status DONE.
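
The push-down run above is the interesting case: with the table's read method set to DIRECT_READ, the planner emits a BeamPushDownIOSourceRel and hands the WHERE clause to the BigQuery storage read as the filter logged by buildIOReader. The projection and predicate themselves are plain Beam SQL; as a self-contained illustration of the same query shape (over a toy in-memory PCollection named PCOLLECTION rather than the real HACKER_NEWS table, so no push-down is involved), one could write:

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class HackerNewsFilterExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Toy stand-in for the four columns the IT projects.
        Schema schema = Schema.builder()
            .addStringField("by")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

        PCollection<Row> items = p.apply(Create.of(Arrays.asList(
                Row.withSchema(schema).addValues("alice", "story", "Beam SQL push-down", 5L).build(),
                Row.withSchema(schema).addValues("bob", "comment", "a comment", 1L).build()))
            .withCoder(RowCoder.of(schema)));

        // Same projection and predicate as the IT's query; SqlTransform exposes its single
        // input under the table name PCOLLECTION.
        PCollection<Row> filtered = items.apply(SqlTransform.query(
            "SELECT `by` AS author, type, title, score "
                + "FROM PCOLLECTION "
                + "WHERE (type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }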

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 5e01680c-31e7-4181-90ea-7c1ec9dbaafa and timestamp: 2020-10-02T06:47:40.175000000Z:
                     Metric:                    Value:
                   read_time                    13.011
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 02, 2020 6:47:40 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 36.513 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 23s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/n3q6sacecu2ge

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1064

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1064/display/redirect?page=changes>

Changes:

[hirayama] [BEAM-10935] Replace @abstractproperty with @abstractmethod and

[Kenneth Knowles] Revert "[BEAM-8543] Dataflow streaming timers are not strictly time

[Kyle Weaver] [BEAM-9026] Avoid side effects from positional value provider arguments.

[Kenneth Knowles] Remove Dataflow Runner Harness precommit job (it runs in main Java

[noreply] [BEAM-9561] Initial framework for testing pandas website docs. (#12819)

[noreply] Update some of the Python dependencies from Beam Dependency Check Report

[noreply] [BEAM-10930] Use dense JSON responses for BigQueryIO interactions

[noreply] [BEAM-10994] Add hotKeyLoggingEnabled to the Java

[noreply] Fix translations.optimize_pipeline() failure on empty pipelines (#12804)


------------------------------------------
[...truncated 273.82 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 02, 2020 12:46:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 02, 2020 12:46:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2020 12:46:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 02, 2020 12:46:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 02, 2020 12:46:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2020 12:46:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 02, 2020 12:46:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 02, 2020 12:46:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 02, 2020 12:46:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2020 12:46:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 02, 2020 12:46:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 02, 2020 12:46:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2020 12:46:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 02, 2020 12:46:18 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 02, 2020 12:46:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 02, 2020 12:46:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-IXQUEHaN6yBux5jTr9EqUN4Uswso8LN86FHgou7iGqE.jar
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-ioN5py1ozxfGiDr1u6R45t0mYhs0V7TKcH7CMn4VTyI.jar
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-q1JCIK3NmLhEwNYj6nx4f0ManVHHoTZmdcY-q6mQj8g.jar
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-cYL7GuBSJo5ywsyH3uCDCyxze-HCpMOgERsA8iicGvU.jar
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-VbzPppj10vxBDlO_xdR3vUjuF4IJN2qkCh3dO9k0bgo.jar
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-3401-T22GBc0Z6I95swmmBHDyS988lJkRMStLyuE_lY.jar
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-MEOatvWF7_QYtOvSA6wsq_jNrug6Y4tbsTJF0MfiR2c.jar
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-415q4Bw2OlvD7yyHYqtANcYVTzX9XjurE2YE9EK_Sks.jar
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6655309028340936948.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ngnsrAa9Zrm-yzbjw7kYB7M-yZzPO3GVMEhVXv1-Wxk.jar
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-oGLvCNjuKFCtsSeXYXp0zFJaL-gU6I3WezsUmhlalPI.jar
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-cYL7GuBSJo5ywsyH3uCDCyxze-HCpMOgERsA8iicGvU.jar
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-NPc_LuTpU4HZL-ywI-D0s8uWnBhedrQnYemAtai2GfU.jar
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-SX42M0bly7_7tPV-UdKFXFxW-zYz12QNyDRe6wCh4-E.jar
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-mWViIg57fWMueSavKqe84ier-6_EWCto1MY1WkpJChw.jar
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-cwJBfzFoZrTOvzW4ADLN4YYubZiI1bMKukndyD6zCs4.jar
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-q6gpn1KmvtfL6NidowHxieTynpYEKgo8QoP8-TTxBw0.jar
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-NEAMPXDokjtrUYks0e1GU3pRDw1MMZOTCuH3TkBj9wA.jar
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-DEX5Y3WHkJ93g8-cnapOxhPY8PGt2dlVJcQLa6lYmcc.jar
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-GKlSa-iEiisvkCmavGwmNuwLd5f9Tcr4nVKOSSIu_Nw.jar
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-GWi7VTsYTxLtB6WjUPeYgR5JDqJuGiytJQfagj0Kf9k.jar
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-AlWIlU6uhJtsKKedoOLDNJp_5HJUV-om8ABK_ha1p48.jar
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-pEtwsqyyAgNC26UaYj4YReMgNZ5nLYc1Eihh_yf6jy0.jar
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-Iv0Mhs-eGD2KVZepFJUZu6rYp-KbkKjNJaAY1nreCn8.jar
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-fd7M9cgi--qHckWxlEWKzMKYghl5AePnqLmKfRbKq8A.jar
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-G9gd_aDbEyYlhSbQtULXHvHO7pvFkw_aRZet0GGHHUA.jar
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-pIt_wEe6DW19Hr_2c7i3tw99jBfdYeP22CqTJ-9miug.jar
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-rHZQSOc4K4fUZQcg2tfEM6s2l1PrjivLQGb14DbSM1s.jar
    Oct 02, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-8iHZ1ON8aj7wTHSZepWHn76wfpMdQDez8rAts_ciYpU.jar
    Oct 02, 2020 12:46:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-Q8Dn6rvs9fmzDoVYr4i9i7NAj0Fb38IbKaDkHNcY7zE.jar
    Oct 02, 2020 12:46:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-6xrcVe2G6jc5RpLiqt6bF_qSlFqG-lzugUcaKho5600.jar
    Oct 02, 2020 12:46:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-AhTGinSG7vH_rpvqW_OTgQeDs2_lupiMdTAnPHNmj1Q.jar
    Oct 02, 2020 12:46:21 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 0 seconds
    Oct 02, 2020 12:46:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 02, 2020 12:46:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 02, 2020 12:46:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 02, 2020 12:46:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 02, 2020 12:46:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 02, 2020 12:46:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95918 bytes, hash 50499909a61b3f2152a8769901ff783d94c8f8b94e3e175124742aa1a2477489> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-UEmZCaYbPyFSqHaZAf94PZTI-LlOPhdRJHQqoaJHdIk.pb
    Oct 02, 2020 12:46:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 02, 2020 12:46:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-01_17_46_22-14583334369285829679?project=apache-beam-testing
    Oct 02, 2020 12:46:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-01_17_46_22-14583334369285829679
    Oct 02, 2020 12:46:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-01_17_46_22-14583334369285829679
    Oct 02, 2020 12:46:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-02T00:46:22.334Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 02, 2020 12:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T00:46:30.253Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 02, 2020 12:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T00:46:33.133Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 02, 2020 12:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T00:46:33.177Z: Expanding GroupByKey operations into optimizable parts.
    Oct 02, 2020 12:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T00:46:33.205Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 02, 2020 12:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T00:46:33.284Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 02, 2020 12:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T00:46:33.308Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 02, 2020 12:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T00:46:33.339Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 02, 2020 12:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T00:46:33.377Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 02, 2020 12:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T00:46:33.886Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 02, 2020 12:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T00:46:34.014Z: Starting 5 workers in us-central1-c...
    Oct 02, 2020 12:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T00:46:52.268Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 02, 2020 12:47:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T00:47:05.409Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 02, 2020 12:47:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T00:47:27.572Z: Workers have started successfully.
    Oct 02, 2020 12:47:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T00:47:27.615Z: Workers have started successfully.
    Oct 02, 2020 12:48:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T00:48:01.023Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 02, 2020 12:48:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T00:48:01.191Z: Cleaning up.
    Oct 02, 2020 12:48:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T00:48:01.286Z: Stopping worker pool...
    Oct 02, 2020 12:48:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T00:48:55.094Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 02, 2020 12:48:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-02T00:48:55.156Z: Worker pool stopped.
    Oct 02, 2020 12:49:02 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-01_17_46_22-14583334369285829679 finished with status DONE.
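
The worker-pool messages in this run (a fixed pool of five n1-standard-1 workers, plus the warning that the requested max number of workers is ignored because autoscaling is disabled) are driven by the Dataflow options the test harness passes. A rough sketch of options that would reproduce that behaviour is below; the values are illustrative assumptions, not copied from the Jenkins job configuration.

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DataflowWorkerPoolOptionsSketch {
      public static void main(String[] args) {
        String[] flags = {
            "--runner=DataflowRunner",
            "--project=apache-beam-testing",
            "--region=us-central1",
            "--tempLocation=gs://temp-storage-for-perf-tests/loadtests",
            "--workerMachineType=n1-standard-1",
            "--numWorkers=5",
            "--maxNumWorkers=5",
            // With autoscaling disabled, maxNumWorkers is ignored -- hence the WARNING above.
            "--autoscalingAlgorithm=NONE"
        };

        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(flags).as(DataflowPipelineOptions.class);

        System.out.println(
            options.getNumWorkers() + " x " + options.getWorkerMachineType()
                + " workers in " + options.getRegion());
      }
    }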

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 782e607f-6e0e-47af-ae8a-6064e8913dcd and timestamp: 2020-10-02T00:49:02.213000000Z:
                     Metric:                    Value:
                   read_time                    12.612
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 02, 2020 12:49:02 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 53.118 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 59s
107 actionable tasks: 69 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/hjstvlbviiv6w

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1063

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1063/display/redirect?page=changes>

Changes:

[ningk] [BEAM-10545] Add 2 show options

[Kenneth Knowles] Add LOOPBACK mode Java ValidatesRunner test suite for ULR

[Kenneth Knowles] Add ULR LOOPBACK ValidatesRunner Jenkins Job

[Kyle Weaver] [BEAM-10527] Migrate Flink and Spark tests to pytest.

[Kyle Weaver] Publish pytest Junit results.

[noreply] fix typo (#12987)


------------------------------------------
[...truncated 281.78 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 01, 2020 6:45:52 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 01, 2020 6:45:52 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2020 6:45:52 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 01, 2020 6:45:52 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 01, 2020 6:45:52 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2020 6:45:52 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 01, 2020 6:45:52 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
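
The exception above names its own remedies: attach a schema to the Row PCollection with PCollection.setRowSchema, or set a coder explicitly with setCoder. Below is a minimal, self-contained sketch of that fix; the schema, field types, and pipeline shape are illustrative assumptions, not the actual HACKER_NEWS table or the test's code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFixSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Illustrative schema for the columns the test query selects; the real
        // HACKER_NEWS table has more fields, and these types are an assumption.
        final Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();

        // A ParDo that emits Beam Rows, analogous to the ParDo(RowMonitor) step
        // whose output PCollection triggered the coder error above.
        PCollection<Row> rows =
            p.apply(Create.of("alice,story,hello,3"))
                .apply(
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void process(@Element String line, OutputReceiver<Row> out) {
                            String[] f = line.split(",");
                            out.output(
                                Row.withSchema(schema)
                                    .addValues(f[0], f[1], f[2], Long.parseLong(f[3]))
                                    .build());
                          }
                        }));

        // Either remedy named in the exception avoids the missing-Coder failure:
        rows.setRowSchema(schema);  // lets the SDK infer a RowCoder from the schema
        // rows.setCoder(org.apache.beam.sdk.coders.RowCoder.of(schema));  // or set it explicitly

        p.run().waitUntilFinish();
      }
    }
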

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 01, 2020 6:45:52 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 01, 2020 6:45:52 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2020 6:45:52 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 01, 2020 6:45:52 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 01, 2020 6:45:52 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2020 6:45:52 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 01, 2020 6:45:53 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 01, 2020 6:45:53 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
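
The BEAMPlan above (BeamPushDownIOSourceRel with usedFields and BigQueryFilter) together with the pushed-down filter just logged correspond, at the IO level, to a BigQuery Storage Read API read with a column projection and a server-side row restriction. A minimal hand-written sketch of that equivalent read follows; the table reference is a placeholder, and the snippet only illustrates the BigQueryIO options involved, it is not the test's code.

    import java.util.Arrays;

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // DIRECT_READ uses the BigQuery Storage Read API, so only the projected
        // columns are fetched and the filter is applied server-side -- the same
        // effect the SQL planner achieves with BeamPushDownIOSourceRel.
        p.apply(
            BigQueryIO.readTableRows()
                .from("my-project:my_dataset.hacker_news")  // placeholder table reference
                .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
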
    Oct 01, 2020 6:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-sQ4Y8igyM5u0-aJh9mSIxt0msipb_8Bbc_rprRuubLg.jar
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-8cJBMWLRx_hjjpXP3Keena9ANULdKf_fL9_2aecZX3s.jar
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-hNIvZta-EzMHS1nSvbWYzZFGFYjWIcpuSInmsJpGClA.jar
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-y-PK0Hh73gN39W_Jlh-x_UqRRFxIUyc2oMzwuVXFHno.jar
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-PEIBsl3Pa9C9-nDRLFjEpzjyJJM7PWCPn10ZvlVI1aM.jar
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-_9oBgzFwdy87ITXwut4m-Au247pf4Vei4Fijb9rJyQQ.jar
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-GcJgOuUXdhMD0ctEgQ5i9bCP5XAe1j3GkjOVtouBJns.jar
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-IHVznuDRuU62nQi3adtCCjy42N4rkKDWqHbioyOQHdE.jar
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-po1Qkqs6xfqK5RZBSs8SWM0gls4gLn0N_ZqFx773rQs.jar
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-pgKzq8eq8jGVjgRcEJXjU8s8gDzI4AzPAs1eo0LeNWk.jar
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-wa-kEr5CYGNTt-0_mrjBN4-rLyFG4jWElAYI6pdf3cA.jar
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-gi9m8BPFPNdk2bM8m0a8VjkSa9c3iQuzlfiww9l4XO4.jar
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-ymZLwsoOvLr_drQhTcvT6Fjr10roz52t9ut7uMiMPso.jar
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-13z2EdncOGNOr8CYDq4lDKw2bB8Mh7bpKf6z3cM4GcM.jar
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-OR8KTM-MknRBoI9GYnA9hRLs0_YlN55W4hkm_aHQWww.jar
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-WzVi0VdFAf_c5kxH0L167VBfVIrIjLdvt66Mf7w0Muc.jar
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-abg-OaIQMaDhd3qHNVHGR1Z3NaC7B0PYURtO2DztTdI.jar
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-ikWMAFei5bXKEFqHBBaHshPrQ-0ctgsLmEvxI4dwjMo.jar
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-tSC3cVRClg_rOI5E_ad0E5_0ckkAFGvih7ex-GySRWs.jar
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-sQ4Y8igyM5u0-aJh9mSIxt0msipb_8Bbc_rprRuubLg.jar
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8440002854921005381.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-w1M4WLhkjNmFrpQLOkZYNkYZu2jAgFXz1taJ05S54Zs.jar
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-zeoPgFhCNATOorxACIRKG9nZG_An11pJbjlPR_H0WhY.jar
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-Vql4mM5ozKuk49ZdeLCWRSMFmtxgUTesu-BA5BXPVTI.jar
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-4KN8fRyEoNn4hx6nlPk3POfIbfCWE57dct2FDG7PciU.jar
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-6_A4HstJK2BCUmJ5gx4ic6GDjZ7rXq4w6_6OAwX4VAU.jar
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-13ioB1agcsA_9I6K9TaJFFz3MJJAY0blTHRfvP6v_DY.jar
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-Moe7yJmV9H-zbL7NPP8ralAXmYftcfzj4YIkje9Shbk.jar
    Oct 01, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-KSsoLAxRqA1jUoSBA2s0tHyJX1G0aT4fG6krkBYQm9k.jar
    Oct 01, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-_rlBXIMH8ZYW_0m1LDC30-O3otc8CYsKMXh47wkO7y4.jar
    Oct 01, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-ul-2aGoFXzAfbPvWR_BW6mK_xbQQ2TGFu2YVY3DA1yQ.jar
    Oct 01, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-ehYsZy7ZtUnWPPzRjNofxfp-rkyS9YP6Mcge1gTHkds.jar
    Oct 01, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 1 seconds
    Oct 01, 2020 6:45:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 01, 2020 6:45:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 01, 2020 6:45:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 01, 2020 6:45:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 01, 2020 6:45:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 01, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95918 bytes, hash 703425cd312ff736eafa5d08b9a3f2dcc3f43b0e12a6ee442e1e39964069249f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-cDQlzTEv9zbq-l0IuaPy3MP0Ow4Spu5ELh45lkBpJJ8.pb
    Oct 01, 2020 6:45:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 01, 2020 6:46:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-01_11_45_59-14679570012648808387?project=apache-beam-testing
    Oct 01, 2020 6:46:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-01_11_45_59-14679570012648808387
    Oct 01, 2020 6:46:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-01_11_45_59-14679570012648808387
    Oct 01, 2020 6:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-01T18:45:59.417Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 01, 2020 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T18:46:09.498Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 01, 2020 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T18:46:10.643Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 01, 2020 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T18:46:10.690Z: Expanding GroupByKey operations into optimizable parts.
    Oct 01, 2020 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T18:46:10.714Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 01, 2020 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T18:46:10.786Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 01, 2020 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T18:46:10.812Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 01, 2020 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T18:46:10.849Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 01, 2020 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T18:46:10.871Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 01, 2020 6:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T18:46:11.928Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 01, 2020 6:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T18:46:12.048Z: Starting 5 workers in us-central1-c...
    Oct 01, 2020 6:46:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T18:46:38.207Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 01, 2020 6:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T18:46:43.504Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 01, 2020 6:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T18:47:03.463Z: Workers have started successfully.
    Oct 01, 2020 6:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T18:47:03.493Z: Workers have started successfully.
    Oct 01, 2020 6:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T18:47:43.377Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 01, 2020 6:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T18:47:43.539Z: Cleaning up.
    Oct 01, 2020 6:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T18:47:43.723Z: Stopping worker pool...
    Oct 01, 2020 6:48:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T18:48:37.386Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 01, 2020 6:48:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T18:48:37.425Z: Worker pool stopped.
    Oct 01, 2020 6:48:45 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-01_11_45_59-14679570012648808387 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ba226468-9283-4b1b-acf0-0c18923c66f8 and timestamp: 2020-10-01T18:48:45.866000000Z:
                     Metric:                    Value:
                   read_time                     18.04
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 01, 2020 6:48:46 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 3 mins 2.737 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 28s
107 actionable tasks: 74 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/pncb6ydcssgcg

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1062

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1062/display/redirect>

Changes:


------------------------------------------
[...truncated 271.29 KB...]
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 01, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 01, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 01, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 01, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 01, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 01, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 01, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 01, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 01, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 01, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 01, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 01, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-HTDcBIfZ3SJfUVltqlWAuc-ujayoJOYV-NZX-U4QToo.jar
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-0OQvpIgAM8vTPk3vJAPUr-oRtAJ8_Lj3Hi1orf8MCZk.jar
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-tJSumI2GCN1V98XBJnHMSV7vNpxw4cAiCgYkNwqQnQc.jar
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-0OQvpIgAM8vTPk3vJAPUr-oRtAJ8_Lj3Hi1orf8MCZk.jar
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5232237250807346644.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-qXDMnZpGINR5Ty_rernd0jHPhA5KDnOvXvEdaWCKMOQ.jar
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-24tzt4q6MPp43pRn3iGXqv2I7D2Btz9nlTwZazxfd2g.jar
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-mnuUupHSE7KyHt2bilSZDw5HB4f54IGUF9A7Vm-8CDI.jar
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-U8CPS3TWhy5-HiRS5iT8JOL_k5U3_303VPXYkZXjX7I.jar
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-SJUsF6hTLG5KCtk-JvJr-PQ-O9BpF6MxHnG_55x1QO8.jar
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-GoDHnj-oBM5DL1wGDmZD7IvwTX5TFbB2HTDeUXG9woQ.jar
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-oWoxhsf5lFkMMi9tPkuCwSSVuuKS9WTAM8mknC9yots.jar
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-hdUMPqEpDowBbIRmt-go_2KSYJvY7MDteQR8uJDDags.jar
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-6KG2u7EkSJ9N6fOCHVfyDZCGR_8SlrqATTkURwDaCjg.jar
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-X0QXwTB7des02vYqCJCIXmfdzrIixxbXVIh1A3WfKrs.jar
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-toQhVF12F6JSRTTlEZHnqC9-wO2h9oTqouyY-T2AMtc.jar
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-oBqrl4QetegYfVK65W-0ZIOkKbsknEEbp_uDFsgCXi8.jar
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-ZhS6QK7Qw7W5pmzQoxO3Z7W8HhIi02NoyAvKQHrYuBk.jar
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-pfZN7sgafLdAcO_nB5TFZw0SdTecHgwRuVHOlAgBydY.jar
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-n4WeWepg1hniWZmU7DNGnJR587F0t1Qet10fjtgoomU.jar
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-jBQkcJ7fBYs-ch40PwL_taB0RZi6PfWGMpEkp6-Fovw.jar
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-oouvhV31cSW1uQTmSmXmLZVXYt5DKKkcxDMqyEtq0ow.jar
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-a1IB3zAPlb93JKUlg8-BCT9jdW_fBnRmKfszWlckBH0.jar
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-_6p5MbrU3W-V3unRDAcDBXqManLbkRzLUaY4jlhFdjk.jar
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-Gcf4zx1awK7eYQAN2OpBTw-XnDadlhj57HTDHENq24I.jar
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-pIhB4BZY8rOtBqp-BvcefHko9aoOWWgwRIp2DEV9Kfc.jar
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-Atn2MTXZmT3YiuW4kCq5MnKnczZgCnMONzxnxvY1Vng.jar
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-SMeVSCARQaXXb_ew3_kV-Pze6jMvKdAhZU0RF4dh-Vo.jar
    Oct 01, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-35IGtw6Q90sVZnCHvf4ckiOsJMeizoZ3LmD6YAG36ms.jar
    Oct 01, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-kZi_0IF8fyOx8w6a3Q5kiPb_LqbfcnxDjyaKMfgZqyk.jar
    Oct 01, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-Xexqjf3QXiI7RdEWtP_7QIxYu3TxMJjQJ_e2mn86hjg.jar
    Oct 01, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-poI0wyvtWwBfX6lma4W5hlauU3sQzWn3JuzZugZMy4w.jar
    Oct 01, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 0 seconds
    Oct 01, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 01, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 01, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 01, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 01, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 01, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95918 bytes, hash 22e0bf4a4cf2007015c3f52f923d64f1b6aed756ea3a6d232168dbec1f0f293a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-IuC_SkzyAHAVw_Uvkj1k8bau11bqOm0jIWjb7B8PKTo.pb
    Oct 01, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 01, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-01_05_45_23-16201281433351477481?project=apache-beam-testing
    Oct 01, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-10-01_05_45_23-16201281433351477481
    Oct 01, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-10-01_05_45_23-16201281433351477481
    Oct 01, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-01T12:45:23.139Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 01, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T12:45:32.559Z: Worker configuration: n1-standard-1 in us-central1-b.
    Oct 01, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T12:45:33.787Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 01, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T12:45:33.831Z: Expanding GroupByKey operations into optimizable parts.
    Oct 01, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T12:45:33.862Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 01, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T12:45:33.964Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 01, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T12:45:33.996Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 01, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T12:45:34.030Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 01, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T12:45:34.067Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 01, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T12:45:34.465Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 01, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T12:45:34.543Z: Starting 5 workers in us-central1-b...
    Oct 01, 2020 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T12:46:00.752Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Oct 01, 2020 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T12:46:00.784Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Oct 01, 2020 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T12:46:03.926Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 01, 2020 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T12:46:26.110Z: Workers have started successfully.
    Oct 01, 2020 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T12:46:26.155Z: Workers have started successfully.
    Oct 01, 2020 12:47:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T12:47:13.811Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 01, 2020 12:47:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T12:47:13.969Z: Cleaning up.
    Oct 01, 2020 12:47:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T12:47:14.052Z: Stopping worker pool...
    Oct 01, 2020 12:49:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T12:49:57.854Z: Autoscaling: Resized worker pool from 2 to 0.
    Oct 01, 2020 12:49:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T12:49:57.890Z: Worker pool stopped.
    Oct 01, 2020 12:50:03 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-10-01_05_45_23-16201281433351477481 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 5de5df1c-4b01-47de-b34c-52e266154b6e and timestamp: 2020-10-01T12:50:03.521000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    27.464

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 01, 2020 12:50:04 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 4 mins 55.636 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 47s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/ldgylk2vwqig6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1061

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1061/display/redirect>

Changes:


------------------------------------------
[...truncated 272.42 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 01, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
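Both failing tests report the same IllegalStateException shown above: the PCollection of Beam Rows emitted by ParDo(RowMonitor) has no schema or coder attached, so coder inference fails when the pipeline is finalized, and the message itself names the remedies (PCollection.setRowSchema or setCoder). A minimal, self-contained sketch of those calls, using an illustrative schema and class name rather than the actual code in BigQueryIOPushDownIT:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        Schema schema =
            Schema.builder().addStringField("author").addInt64Field("score").build();
        Row row = Row.withSchema(schema).addValues("someone", 3L).build();

        // Beam Rows have no default coder, so either attach the schema or set
        // a RowCoder explicitly (the two remedies listed in the error above).
        PCollection<Row> rows =
            p.apply(Create.of(row).withCoder(RowCoder.of(schema)));
        // When a transform such as a ParDo produces the rows instead:
        //   rows.setRowSchema(schema);           // attach the schema, or
        //   rows.setCoder(RowCoder.of(schema));  // set the coder directly

        p.run().waitUntilFinish();
      }
    }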

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 01, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 01, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 01, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 01, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 01, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 01, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 01, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-36fvOsk4d5W6EbeImTu0PY-jm_1rLMrmW6tfn6ARjko.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-BFqxFBiRxDFg07HPrQxqFfdekXVl73EiTWexOhMevpM.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-V2OWJm6PIDQK_1N_brt6QepFtoTIuf7TFDwXpmPDtUU.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-emktqDcAUXdR5qbDDRu7UBePmpiavAp-12nwvW2x2H8.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-4sNFxByHQi1zEn-P3hjr7TANmu6LBCVTTcAGbj_USOw.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-_XeOFWpPc0CCwIGDxGg6KL9G27zj3c_LzcCtgWt6D0Y.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1020911360978753597.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-rMe8KAT93jvq7CSy0YTdjDuzkiNPFlqz5hpO7q7fD8g.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-M2C-JmCsH9lZBUMyOUVFkN5NRsYbSLpqKcvVZzo5vSI.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-yaLbJrtKjkwHpR3N3POundU6BiqbNQYUgSeM_5DvthE.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-gSVbV1mvEiKGncaXNDzjTn3nCYBso_LBPvCbrFuuCSY.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-OZm_C8uk58zp7x6dXL-uZ1u0886tL1jFI1NPp2yN32Y.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-yC9eEbhJuGW6HXSufVappcg2v2fiE9WhkLoCbLf2Dgs.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-3PcENXIJNlGZQybpCOpoXDTZk_4XIKIKHk2DoJrG0_M.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-gH6t-bvKSTV9vjfeaT2x-x75DKXNWun5_muiADhmx4s.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-PjTcqww5zx2A_oOeMelWKnx9cO-2vddWSNct9Csh2-M.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-iFPtoBpl0Jjam8cib5htyLxHRPgt1D_FBk5Dj1fAmgE.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-SN9Dk4VkK2hWg3CKYzDiOHonGDgXBU9oYaHetsnN5QQ.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-3TTmIw4GXASfHVwImWxl04PQZ66oeZjrXhHJygO28Jo.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-Ba-kRNfkWaN_5SdP_l5fZCzK0W4v0Y2bGE2qs2ecows.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-m2bpmt3che7xxfdvt-X1z8ioFnuRYJ0QX-X-ttj5it4.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-1il5g0lJiwTLja518a5aiYTmHNzrD90hgwXiyUuhwR0.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-ggVqEWqejhWoce6KFgVBAepaQFtORsi6X64F-V6oaS8.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-MojQRjbLWVjI8stYsAgdKV5pzfcpWPhnj04_ZhhDzJs.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-36fvOsk4d5W6EbeImTu0PY-jm_1rLMrmW6tfn6ARjko.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-8Xal4bPuH2H5NM4qDrBut4Smiqs_4WScCkWaPD13Grs.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-Dby6uZ9nbxo4YOUtZF_hSGYqfpW8dVrsFgdurGjOPvw.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-GRLGOKvX51tXL9Mw7baYnM6BXAojdqnpPX4Ei9R7j-c.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-8Rd1x7j57HV5MjRICzbajzbS9Ck1HHQai3SlQFP2iNU.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-4IAx-0htpOUW2IvxFyV4Ni8BrXaLZfX9_p9Bw-HKqp4.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-IHnCQLyRTisShLoyp9vgS9bTAdvdM5wKKNlsWJaEr3k.jar
    Oct 01, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-tqU2qu7-TVcWzgNomOU_BuO62Xhx6EaCvzhxx3f_OFg.jar
    Oct 01, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 1 seconds
    Oct 01, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 01, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 01, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 01, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 01, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 01, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95918 bytes, hash 2efb87d71c733e11d12a74bfdf5e931ae9d15903c95266a997fd8b337c1f4ece> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-LvuH1xxzPhHRKnS_316TGunRWQPJUmapl_2LM3wfTs4.pb
    Oct 01, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 01, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-30_23_45_17-5325515204429877640?project=apache-beam-testing
    Oct 01, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-30_23_45_17-5325515204429877640
    Oct 01, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-30_23_45_17-5325515204429877640
    Oct 01, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-01T06:45:17.758Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 01, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T06:45:24.931Z: Worker configuration: n1-standard-1 in us-central1-b.
    Oct 01, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T06:45:26.504Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 01, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T06:45:26.551Z: Expanding GroupByKey operations into optimizable parts.
    Oct 01, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T06:45:26.596Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 01, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T06:45:26.658Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 01, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T06:45:26.689Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 01, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T06:45:26.720Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 01, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T06:45:26.747Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 01, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T06:45:27.201Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 01, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T06:45:27.278Z: Starting 5 workers in us-central1-b...
    Oct 01, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T06:45:48.337Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 01, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T06:45:53.849Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 01, 2020 6:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T06:46:14.536Z: Workers have started successfully.
    Oct 01, 2020 6:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T06:46:14.561Z: Workers have started successfully.
    Oct 01, 2020 6:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T06:46:45.187Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 01, 2020 6:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T06:46:45.332Z: Cleaning up.
    Oct 01, 2020 6:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T06:46:45.402Z: Stopping worker pool...
    Oct 01, 2020 6:47:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T06:47:39.652Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 01, 2020 6:47:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T06:47:39.697Z: Worker pool stopped.
    Oct 01, 2020 6:47:45 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-30_23_45_17-5325515204429877640 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 9b8d4c80-af82-4870-9c43-ba3fdd22bd50 and timestamp: 2020-10-01T06:47:45.411000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    11.685

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 01, 2020 6:47:45 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 41.009 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 30s
107 actionable tasks: 67 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/plcjp76i5d2bu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1060

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1060/display/redirect?page=changes>

Changes:

[justinwhite321] Upgrade conscrypt to latest version

[Kyle Weaver] Revert "Fix up-to-date checking mechanism on Tox Task."

[Kenneth Knowles] Set default Python version for virtualenv to 3.6, the minimum supported

[noreply] [BEAM-9681] Add textio.Read lesson to Go SDK katas (#12941)

[noreply] [BEAM-10200] Respect profile_memory option and add memory profiler to…


------------------------------------------
[...truncated 279.52 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 01, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 01, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 01, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 01, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 01, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 01, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 01, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 01, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 01, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 01, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 01, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Oct 01, 2020 12:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-hxHL7B003DAn5hhJvOBsS2VRNDzkYSjI7Xmu3LvhB2E.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-4oonz_kumL_cJ2qb-Xa5rCMofj87Bd9XHQ_sLIEVDYA.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-HRlU4oEC7P5i3kwHBhwsON0vHih8AfY5OTPgdKydmzo.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-wsWHHi_NiL54IMhLW3mjXVBkv7dcUy-7DZznnswqFe0.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-evlG_MefN3WHSasnXF60OAJ3MJp81mNEHZHqIIoA9eA.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-mHXB1MBUCp1_kePvf7SECsaHKQUrj2_HtvV3lt-GpVM.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-IwXKT9pjhO82uCHcfbC_yOgfgWWi0YRsE4FCjk2SGHY.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-M46YeHv5Ga8MG3SfCMW4Xb6VNCiP6LmFEK7cR90NH4E.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-hxHL7B003DAn5hhJvOBsS2VRNDzkYSjI7Xmu3LvhB2E.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-R2_RDD19tv3qxsR3KAAly5o3XvKoUCbnfqbR7P-zLbc.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT--wwYNFfXF8pAzeumE40f89Xjo5FbUQX2LsyI1O18W2c.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-n1IoevSdAwTntbVPmfJBJeTslFW28YV9CL9Wxv006Kc.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-XRjvBUsRfb0Ne9bNyfbHxmPnRm32-oU_TWaT9EC_svE.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests--wY6DAVJz1b15CIPJvqFIda0-EYrfxTcSidKY_dfg_8.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-BEFYmtIAP5SU0jr9Jt1WrW3LHmHCxHEdyGaQ1bFt6qA.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-O5qwv2o2CnhR6eqm4_7cxG_9KpxZmQ2a3C6rVPK3Vn4.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-YW2bGgsDUhAEO32jdcGZvOiNXw2gcHZaS-9wzq0ybJU.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-8bFjEVsfoPaDyhrOzvMwjA2yJfrmoRTEFVwq64LFeYM.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-GwpCV-xnGJ5W1dfH94AjEXIrAllmObua_VeZl7A-WB0.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-z2pcJyPeCNWrJN1f0PAzKKZcVAy_u2KK1nxFmP2DPsM.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-qL04d_-4Pr9bJoEyURzLXe0cmuP8ivAv2RUeAX4qIfM.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test9192971367191216936.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-VVtEGka3SBuf2A3I1mUMg81zeW4h6kYch31uNgUHptg.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-TyJTS71efSxbm-lC5fmX2C-5zQiUw26MUsMW7cw_A38.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT--17jbi2-gPjLBeLWGZoC87BRtzK2Xm1nxbLabIn-eqA.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-8GRdz3sJamJDVdRPu6rsLQmQeI532faARfqw98rRJcQ.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-U-h8stcgJfKbkbpAbtS4eUb1VoZXE9KUS11Cc6LxEOc.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-a87jqTWMWs7fw48XH0Y-MPHNt3pw5ao6PZ9V2oEt2yU.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-yCmiCvRsdcNZ4ki5cB5sQ8FEjDR7pLwvpZDbWJvLbCo.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-K1s7gOuVLHCcopzZ7p--7sJpV0GRoQL13A0mylCuau0.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-pY4Zf_ZKxzB8pncsQ2dE7g98_k2xf5nLdx39QYl7xvc.jar
    Oct 01, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-_oJ3NsmlIUYC7Ap72PBAJNxkXtkS4Epf8PkrC8tE6Bo.jar
    Oct 01, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 1 seconds
    Oct 01, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 01, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Oct 01, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Oct 01, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Oct 01, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 01, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95918 bytes, hash a1ec324a860d4df57cd630b221f533142a43c2e7254b05b6338e308d2c797f86> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-oewySoYNTfV81jCyIfUzFCpDwuclSwW2M44wjSx5f4Y.pb
    Oct 01, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Oct 01, 2020 12:45:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-30_17_45_44-16062935913013719394?project=apache-beam-testing
    Oct 01, 2020 12:45:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-30_17_45_44-16062935913013719394
    Oct 01, 2020 12:45:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-30_17_45_44-16062935913013719394
    Oct 01, 2020 12:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-10-01T00:45:44.046Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 01, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:45:50.625Z: Worker configuration: n1-standard-1 in us-central1-c.
    Oct 01, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:45:51.272Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 01, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:45:51.314Z: Expanding GroupByKey operations into optimizable parts.
    Oct 01, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:45:51.341Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 01, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:45:51.410Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 01, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:45:51.429Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 01, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:45:51.451Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Oct 01, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:45:51.483Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Oct 01, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:45:51.940Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 01, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:45:52.001Z: Starting 5 workers in us-central1-c...
    Oct 01, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:46:15.020Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 01, 2020 12:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:46:16.921Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 01, 2020 12:46:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:46:37.038Z: Workers have started successfully.
    Oct 01, 2020 12:46:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:46:37.062Z: Workers have started successfully.
    Oct 01, 2020 12:47:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:47:05.727Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Oct 01, 2020 12:47:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:47:05.859Z: Cleaning up.
    Oct 01, 2020 12:47:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:47:05.947Z: Stopping worker pool...
    Oct 01, 2020 12:48:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:48:00.238Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 01, 2020 12:48:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-10-01T00:48:00.283Z: Worker pool stopped.
    Oct 01, 2020 12:48:07 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-30_17_45_44-16062935913013719394 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b0f618f6-91c7-49dd-9815-4c661eaa92ed and timestamp: 2020-10-01T00:48:07.370000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.102

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 01, 2020 12:48:07 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 37.095 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 51s
107 actionable tasks: 73 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/rq4kvhrcsquqy

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1059

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1059/display/redirect?page=changes>

Changes:

[zyichi] Add nexmark python query 10 to choices

[Kyle Weaver] [BEAM-10671] Add environment configuration fields as a repeated pipeline

[Pablo Estrada] Passing project properly in BQSource

[noreply] [BEAM-10986] Rollback to shadow 4.0.3 (#12969)


------------------------------------------
[...truncated 280.51 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 30, 2020 6:45:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 30, 2020 6:45:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 6:45:45 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 30, 2020 6:45:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 30, 2020 6:45:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 6:45:45 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 30, 2020 6:45:45 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
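
The failure above is a schema-inference problem: the ParDo(RowMonitor) output is a PCollection of Beam Rows with no schema or coder attached, so pipeline construction stops before the job is ever submitted. Below is a minimal sketch of the remedy the error message itself points at; the four-field schema, class name, and element values are illustrative assumptions, not taken from the failing test.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Illustrative schema; the real one comes from the HACKER_NEWS table definition.
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

        Row row = Row.withSchema(schema)
            .addValues("someone", "story", "a title", 3L)
            .build();

        // The fix the message asks for: give the Row PCollection an explicit RowCoder
        // (the manual setCoder route) so no coder inference is needed.
        PCollection<Row> rows =
            p.apply(Create.of(row).withCoder(RowCoder.of(schema)));

        // For a PCollection<Row> produced by a ParDo, as in the failing test, one would
        // instead attach the schema to the output: output.setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }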

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 30, 2020 6:45:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2020 6:45:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 6:45:46 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 30, 2020 6:45:46 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2020 6:45:46 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 6:45:46 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 30, 2020 6:45:46 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 30, 2020 6:45:46 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
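
    The two log lines above are what the push-down test measures: only the used fields are requested from BigQuery, and the supported filter is handed to the Storage Read API as a row restriction. A rough hand-written equivalent at the BigQueryIO level is sketched below, assuming the public Hacker News table; the test itself builds this read through the SQL table provider rather than calling BigQueryIO directly.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import java.util.Arrays;

    public class PushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Table reference is an assumption for illustration only.
        p.apply(
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full")
                .withMethod(Method.DIRECT_READ)
                // usedFields=[by, type, title, score] from the BEAMPlan above
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // the pushed-down filter from the log line above
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
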
    Sep 30, 2020 6:45:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-zGr6hp_tOPjjTrGxPLWbUKxzR6jOJPm_s3yZ7SIIqEE.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT--6AOQwr_lhjcphHsAODaYhZx-kbR8xp48sqLixyw1lA.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-aDjiKG7-VmZf2ohhA8eEY1z_grDIci1FR5INnvhsAy0.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-MInXDxCmFQYZ-R3ru4OYGwsomv6wVz0p_k4iwQ77jZ0.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-mj4Nm0a98QMj2i5XtabA54lunoEnugIGXImzdJa32dw.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-o6_aBtCACqHcGlUMlIS8mlX0ui27m2KEEl31xwOrSvI.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-SFFsR7FJRxB6cnv6Ij2RQ8UH5xzkiW92HQ2iXr_lIjM.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-9ZJzYZd6YyT2kfkW9RaPM1OyOIbn9ERYt_5AHo6byGI.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-yWRtYmt4m4cn6CouWasm1vJiqn_bY6ygQt4v68-zFtw.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-_3lbb06nnS6YV7Yjqvcr5Z46LpcZT00AOC5IPuzVHQM.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-md9PAUi7stQajKKxgqF1PN2WNXqrTOsXeaQxI_sB8tk.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-rxA40I4iCtvAhYL6ewJ6wjmOw0c6pSNqVoLKHk-bsMw.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-zGr6hp_tOPjjTrGxPLWbUKxzR6jOJPm_s3yZ7SIIqEE.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-VrqIKQDyb_K1P9vQvDz1GgCV5VFsHb9KefXB3Yn04fU.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-sOLtpusI33eqrk0Rsigsj42S7D-F7aA9xywjhu7bzl0.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-3jCjDTdqKrK5BwefnPLyCe0Iqtrf45DzRXQZvjzT5DY.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-8pkolRL7me3Ub_GnipujJ3ckvwHDQM7vzrKc5LvOSlA.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-yJk28HO3tOqg5ojMVK0aMcMKh0fcrdtgcctJhnZZA58.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-w4p3HA0kJEN1ZgY7cWMNBwZr4UvZVEvC7La-T5JGD34.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-pbtboAXIm04olc2z9HU7leQdg5VKkeIwtvUGBbR7SYY.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1193377539281656684.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-tcZjIbOez4CKAfr4K6L9GliN5uUn8idrM1kE4NEcYSE.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-zJzxw7gRFkSnH7S5MkYwmpSXrxt0ZIlcSqY3qWKlUk4.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-pb22rXFZMBaoQoUr8aPUBvZKTpqOEIrwj9SiXuKd0mc.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-WQNZJhlKO4RUUadrVpT1GOU4GA3q9FeP3yZ1PGAZhf8.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-4GQo9Byht0TPo7DEGSH995JYzUoXuZ9Nqtr6huwwxr4.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-rhck1a7X0I3HmZKAyAdf5mpkJIsqCsfG69Q2JMrm5yw.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-Lc46Pu7JFbPShoiPVGrAvBrOjndur8ppAQCHPRBsF8M.jar
    Sep 30, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-OWVBiXowbv03czyiKp-nwE5pHJgl0GvsE5j2MKM3l2c.jar
    Sep 30, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-8Y1yhVCAAybTMBPL3psN8NAVSVs0WexrBogZfUkmWYw.jar
    Sep 30, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-tFPR1h8-sRpOzDZdK1ME55kd-O6eqlGwKTkCJpuTME0.jar
    Sep 30, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-TyxRh4hoSym4hOsAVn1UhUaBh5Z1qFb_jKDMnHI8Kwg.jar
    Sep 30, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 1 seconds
    Sep 30, 2020 6:45:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 30, 2020 6:45:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 30, 2020 6:45:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 30, 2020 6:45:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 30, 2020 6:45:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 30, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95918 bytes, hash c23fd578ef22c09ae1c7e4a0fac7f7dd4e823f540260a6886ecdb969bc9e6ea9> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-wj_VeO8iwJrhx-Sg-sf33U6CP1QCYKaIbs25abyebqk.pb
    Sep 30, 2020 6:45:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 30, 2020 6:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-30_11_45_52-1626663165689262608?project=apache-beam-testing
    Sep 30, 2020 6:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-30_11_45_52-1626663165689262608
    Sep 30, 2020 6:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-30_11_45_52-1626663165689262608
    Sep 30, 2020 6:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-30T18:45:52.229Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 30, 2020 6:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:46:01.688Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 30, 2020 6:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:46:03.963Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 30, 2020 6:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:46:04.005Z: Expanding GroupByKey operations into optimizable parts.
    Sep 30, 2020 6:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:46:04.078Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 30, 2020 6:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:46:04.154Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 30, 2020 6:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:46:04.201Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 30, 2020 6:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:46:04.225Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 30, 2020 6:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:46:04.257Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 30, 2020 6:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:46:05.140Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 30, 2020 6:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:46:05.222Z: Starting 5 workers in us-central1-b...
    Sep 30, 2020 6:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:46:18.873Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 30, 2020 6:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:46:32.997Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 30, 2020 6:46:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:46:49.944Z: Workers have started successfully.
    Sep 30, 2020 6:46:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:46:50.002Z: Workers have started successfully.
    Sep 30, 2020 6:47:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:47:23.138Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 30, 2020 6:47:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:47:23.554Z: Cleaning up.
    Sep 30, 2020 6:47:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:47:23.659Z: Stopping worker pool...
    Sep 30, 2020 6:48:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:48:13.466Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 30, 2020 6:48:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T18:48:13.507Z: Worker pool stopped.
    Sep 30, 2020 6:48:19 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-30_11_45_52-1626663165689262608 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 3443c772-4ea8-490c-bea0-0034df0e51ae and timestamp: 2020-09-30T18:48:19.471000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.515

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 30, 2020 6:48:19 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.035 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 42.659 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 2s
107 actionable tasks: 73 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/gj5obdeouapgw

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1058

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1058/display/redirect>

Changes:


------------------------------------------
[...truncated 273.82 KB...]
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 30, 2020 12:45:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 30, 2020 12:45:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 30, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 30, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 30, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 30, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 30, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 30, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 30, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 30, 2020 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 30, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-Osn9U0E7G_FpFG_KwJG2OXrb-40dUaRKuKWw24XDFsc.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-jtgeCNxBnSYn2hMfl7_t0DN6RFIFLdhyreSK8uxz-eE.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-dVvnf2CdGrD4sW66-cBlqiguhHHWpTiWvCqZZOLYF8s.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-qghle6w0vGz5GCTdFYbLE8p-CYXSltHEHLXEavG04gM.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-gP38SmqAZ-wSOrjvdvORi_71KB8I3u9nGPzwEiLgrxE.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-bJIEL6d7-96rN7l_PynkItbdU6a8iZT3buYAgRSIHiE.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-ceLoPaONeNU89DWOxAk_J2iPUoL-80VkSnj5M2w_RhM.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-qqJoGKHyGMCTScR-FkV-azj9O4cscvh5fUKCr7-hovU.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-aLzwIPTd6FwPK_pkb1Xn519rPCWFT-emSEvjieY2FLk.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-SGePdnhis0gv0mWFODvP0-lRpjARguvJ-JgwcdzqHLU.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-L7BaHrMEJH3T3nzDtlOytftwBSf4w_DPafhpqeuJCIg.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7352304566873775243.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-VETHo8LH_r_9b3Ov-BsueRcmPjNKiLaiqroRi-g-I8c.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-cMO4PbWL3kdS96H6umeR8Znp_gc39WOF2Env14P7Ux8.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-gP38SmqAZ-wSOrjvdvORi_71KB8I3u9nGPzwEiLgrxE.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-nsInsFISszx4gzy_jt4vzrvpL-VulGGPG8BWf-aRwBg.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-hJv1Rl8GHfzI75aa2iNEiJaNUFBedMUveqQM8hl7NOI.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-ZXG7ZknFg7H3VjC1AGiCXjhZuStdf1WuodKKg6Hm2pI.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-Sw6lME1OrXbjJGTDsn90fAqBEeECZ5iLkF-vunhW1uw.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-bd8ciYcAr5biqhQogVCfY19fape-G_CKOhhlY8zlE4I.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-TXe1r8-CDWPgHqLw9nJQtT81zvaxY8nRY8USvkgLIZI.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/6.6.1/workerMain/gradle-worker.jar to gs://temp-storage-for-perf-tests/loadtests/staging/gradle-worker-Ora8jtQMVjsUpI1we4r1Fm1LEPQlVZSqP04WYkOOCJc.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-InEnt0drvDmNfnRpXjj9oHvki1U5YtNQSwnj8UucgWo.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-guaRtWeHOZkH34mLCcSFUT_3tZ3G4IHjJVDEOj5IK0M.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-pQaqijK3NUIva_AOWDQg8F2Dg9iEKE-ZvVTNZcavg04.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-jeIApskJio32QDkYHE-eBB26N_5bMvs17KPrbbBdAdU.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-zNa9si9GHTgjNpSbKRM0RqFPEPWCTQajol-8WAd_GbE.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-9r9VUvI5pF_0arYyKs0tcifrLM84pA_O7jhl9AL7_L0.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-yjNW0AZERPtQDvQtG3TiIlKXXGJRW6njE3uE0ByUl30.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-g_h9h9ICUitXAroM_qfHF7jXHM2e6PC4mRhTPv3IEMQ.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-CqY4WzofWmtHZBP4ziRnXtAMkb6cywl038yz87ds8V8.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-UNt0atMulRAKqcd98lesmjx8qRZK8Czdz51zIOLbgYA.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-qndCLwDT0dk8i9Tr4eXzaAQGkHiVyXtiJ-cKihx47T8.jar
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 31 files newly uploaded in 0 seconds
    Sep 30, 2020 12:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 30, 2020 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 30, 2020 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 30, 2020 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 30, 2020 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 30, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95918 bytes, hash e9a85cc77fd3cfa16014f47d4fa309ee95d5db3f2198db91de2a75024e2d4ab0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-6ahcx3_Tz6FgFPR9T6MJ7pXV2z8hmNuR3ip1Ak4tSrA.pb
    Sep 30, 2020 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 30, 2020 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-30_05_45_36-14555014956868873490?project=apache-beam-testing
    Sep 30, 2020 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-30_05_45_36-14555014956868873490
    Sep 30, 2020 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-30_05_45_36-14555014956868873490
    Sep 30, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-30T12:45:36.563Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 30, 2020 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:45:44.118Z: Worker configuration: n1-standard-1 in us-central1-c.
    Sep 30, 2020 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:45:44.915Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 30, 2020 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:45:44.963Z: Expanding GroupByKey operations into optimizable parts.
    Sep 30, 2020 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:45:45.031Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 30, 2020 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:45:45.114Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 30, 2020 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:45:45.160Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 30, 2020 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:45:45.192Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 30, 2020 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:45:45.213Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 30, 2020 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:45:45.615Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 30, 2020 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:45:45.691Z: Starting 5 workers in us-central1-c...
    Sep 30, 2020 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:46:10.762Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 30, 2020 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:46:16.662Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 30, 2020 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:46:34.437Z: Workers have started successfully.
    Sep 30, 2020 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:46:34.472Z: Workers have started successfully.
    Sep 30, 2020 12:47:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:47:14.432Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 30, 2020 12:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:47:14.570Z: Cleaning up.
    Sep 30, 2020 12:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:47:14.659Z: Stopping worker pool...
    Sep 30, 2020 12:48:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:48:08.395Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 30, 2020 12:48:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T12:48:08.442Z: Worker pool stopped.
    Sep 30, 2020 12:48:15 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-30_05_45_36-14555014956868873490 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): af10f2c1-b72f-4933-97fd-5635d0562d44 and timestamp: 2020-09-30T12:48:16.002000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    25.416

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 30, 2020 12:48:16 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 52.673 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 59s
107 actionable tasks: 65 executed, 42 from cache

Publishing build scan...
https://gradle.com/s/6qo7g2vl424by

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1057

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1057/display/redirect>

Changes:


------------------------------------------
[...truncated 275.46 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 30, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 30, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 30, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 30, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 30, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
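
For reference, the IllegalStateException above is exactly the coder-inference failure its message describes: the Row output of the RowMonitor ParDo has no schema attached, so no RowCoder can be built for it. A minimal sketch of the fix the message itself suggests (setRowSchema) follows; the field names and types are assumptions taken from the SELECT list in the logged query, not the IT's actual code.

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class AttachRowSchema {
      // Assumed schema for the projected columns: author, type, title, score.
      static final Schema PROJECTED =
          Schema.builder()
              .addNullableField("author", Schema.FieldType.STRING)
              .addNullableField("type", Schema.FieldType.STRING)
              .addNullableField("title", Schema.FieldType.STRING)
              .addNullableField("score", Schema.FieldType.INT64)
              .build();

      // Attaching the schema lets Beam derive a row coder for the PCollection,
      // which is what the "Unable to return a default Coder" error asks for.
      static PCollection<Row> attachSchema(PCollection<Row> rows) {
        return rows.setRowSchema(PROJECTED);
      }
    }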

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 30, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 30, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 30, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 30, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
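
As a side note, the two plan entries above show the push-down working as intended: only the used fields (by, type, title, score) are requested and the supported filter is handed to the BigQuery Storage API rather than evaluated in the pipeline. A rough sketch of the equivalent direct BigQueryIO read is below; the table name is an illustrative assumption, not taken from this log.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.values.PCollection;

    class PushDownEquivalentRead {
      static PCollection<TableRow> read(Pipeline p) {
        return p.apply(
            "Read HACKER_NEWS with Storage API push-down",
            BigQueryIO.readTableRows()
                // Assumed table; the IT reads its own Hacker News table.
                .from("bigquery-public-data:hacker_news.full")
                .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                // Column projection: only the fields the query uses.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Predicate push-down: the filter from the WHERE clause.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
      }
    }
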
    Sep 30, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-G4j8L1hb91zJA1mODNyIJol5fO6OIdYLWLRR93RQnw4.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-jcZL2HdwHa39fp2LtQMINfq_MP7ZoihtWLmGW6UCkk4.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-Vugetv-LW7wLQhjQIt5uzyYEvFrkECcW0o9TAHkArmw.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-EhVtn4GBkm6vwVm78YKofJ-hECnW1MUrKWfJIEDhFTI.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-G4j8L1hb91zJA1mODNyIJol5fO6OIdYLWLRR93RQnw4.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-Vfi63ygK9r6gjlR8pinBccbIydRLlZSzigqHEnwRn4U.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-SQC5FFR6IhdmIBCkfbnu9ja7OcpkrCZ5oCjcYaH195s.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-mQ2bwJLwbjR_F-JAB9496a5pPKN8B1rGGkmEwgrUNww.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-yfXUCSr-U2T3TxwEkG2T6aLpw-VzOdQR6Yc2zaOXQ8o.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5277594357761847256.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-coLSbNUQHsphxMiB7s8x57I_BuJYRmuVrqzL_HuDZU0.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-Cau_0jvmuTUTvFjkrVDRiIO8sAObOZQyMhfOXajVjx0.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-w0K1qSf0WkgkybYy3E1qmMS1kON8pXOcMemimnUxQkc.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-PqZNPg5uJU0id6FPUW2riCji34UC4VGuOM0ysLVzpUk.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-mropw2GNTyStjUwAjw4Y70Fahzq47r4Oc3EydgTAonc.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-ZVUl8WA32vYVryLN1fp_mjGxa5eFrkTcMRlkFMpG6f0.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-2feGg9H3QA42MJp4ERRzGRcMQ37XWgmLRUImT_OxRLU.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-mH96JueqGDzrNzckvGbb4C9YcxMIaquQQb25HBS1EjE.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-biyyGhK0X5OYc6PQ_3dIW19PhJujBptRx4G-gPKirQ4.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-HzDgdbuSyccxC4lYPb-MzV5gsdpSWIQhTBGjr_XkuvU.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-xCFZoD9j9zNo8qhogrdZlzGVf6ui_Iccb5TaWVC5zAU.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-MFwBj_xuN0F1ar-WyHb9pgXtsOM9_HD_2ilxKRVs8J8.jar
    Sep 30, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-pp2rO-BaxXjV2D5Dnt9MF1mg7Hm6nSj3MLUvkgx2ACs.jar
    Sep 30, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-q7NZSFP7UBqAhHtCjsKLKoO1QEBu_6wGrVLEdzyovRM.jar
    Sep 30, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-js9TxKIKFNE0XR5bH07zAjIogHEaahMbpmmkHYb7chE.jar
    Sep 30, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tmj29aGgloqT6ENN0D92aLCO1-vz5x5vdf4yEp7VlzE.jar
    Sep 30, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-EzMJIhGCizO1xiCBV7L5lUetQ1hAyWB8nNGDZBalq2Q.jar
    Sep 30, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-72UJCxu8IKYv4skYSKR81x0WMVDcifNjpgGYg-TBAJc.jar
    Sep 30, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 1 seconds
    Sep 30, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 30, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 30, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 30, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 30, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 30, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95918 bytes, hash b6d7dcd795139ca23d5ed43f6cb949440a8c4eda6c19702ca47ae7d5e55777b7> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ttfc15UTnKI9XtQ_bLlJRAqMTtpsGXAspHrn1eVXd7c.pb
    Sep 30, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 30, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-29_23_45_38-7878334072495782317?project=apache-beam-testing
    Sep 30, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-29_23_45_38-7878334072495782317
    Sep 30, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-29_23_45_38-7878334072495782317
    Sep 30, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-30T06:45:38.559Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 30, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:45:47.137Z: Worker configuration: n1-standard-1 in us-central1-c.
    Sep 30, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:45:47.716Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 30, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:45:47.790Z: Expanding GroupByKey operations into optimizable parts.
    Sep 30, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:45:47.822Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 30, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:45:47.900Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 30, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:45:47.936Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 30, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:45:47.963Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 30, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:45:47.998Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 30, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:45:48.817Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 30, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:45:48.904Z: Starting 5 workers in us-central1-c...
    Sep 30, 2020 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:46:00.427Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 30, 2020 6:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:46:17.154Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 30, 2020 6:46:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:46:40.481Z: Workers have started successfully.
    Sep 30, 2020 6:46:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:46:40.530Z: Workers have started successfully.
    Sep 30, 2020 6:47:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:47:12.583Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 30, 2020 6:47:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:47:12.767Z: Cleaning up.
    Sep 30, 2020 6:47:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:47:12.868Z: Stopping worker pool...
    Sep 30, 2020 6:48:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:48:03.659Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 30, 2020 6:48:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T06:48:03.702Z: Worker pool stopped.
    Sep 30, 2020 6:48:09 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-29_23_45_38-7878334072495782317 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f4de1de7-cde3-4fdf-aae9-7d3b27a534db and timestamp: 2020-09-30T06:48:09.228000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.463

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 30, 2020 6:48:09 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 44.669 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 52s
107 actionable tasks: 65 executed, 42 from cache

Publishing build scan...
https://gradle.com/s/j5ec3fcv3funy

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1056

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1056/display/redirect?page=changes>

Changes:

[tysonjh] Fix Java nightly snapshot build.

[sychen] Fix failed GroupIntoBatchesOverrideTest

[noreply] [BEAM-8024] Add JPMS E2E test (#12899)

[srohde] Adds duration string to the ib.show and ib.collect

[Kyle Weaver] [BEAM-10953] Add errorprone-slf4j plugin.

[Kyle Weaver] [BEAM-10953] Fix errorprone-slf4j errors.

[noreply] [BEAM-8106] Add version to java container image name (#12505)

[Robert Bradshaw] Remove experimental declarations from fileio.

[noreply] [BEAM-10677] Fix @SchemaFieldName in AutoValueSchema (#12520)


------------------------------------------
[...truncated 278.59 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 30, 2020 12:46:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 30, 2020 12:46:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 12:46:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 30, 2020 12:46:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 30, 2020 12:46:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 12:46:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 30, 2020 12:46:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 30, 2020 12:46:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2020 12:46:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 12:46:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 30, 2020 12:46:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2020 12:46:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2020 12:46:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 30, 2020 12:46:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 30, 2020 12:46:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 30, 2020 12:46:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 30, 2020 12:46:24 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 30, 2020 12:46:24 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-Qu8vNawVK1SacREE87sOYuQG-VMa6pqiWnQyUtAoLYI.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-oOZ_8sfWmEHZO0FccBW6ePwcJ2n1dEKlX0XCl4rnUZE.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-BNDklvMapjBbD1Mz-ouHcOAxm9oqiUuwVVcPYY9VuZg.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-vt7ywL-YEC_Z-1yqK05WtSNwPy6tQt21lp7p_X_YF4A.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-XghNs7OYlKgXIiDC9cPMjrde5TaX7gFoApD9ip5gGR8.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-K-66a0DFwLWc7Q--DpfoQtH6rxhDBgaFGq0hEYJhDyU.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-m24-cC-Z3n-4TQyNwT2YKK_PDnG3tyaQpYFqtqDXddc.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-EvZhiYneNWZmZ1XVMdSpPQgHVptkEqiwYBSCFeDiWuA.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-6RBkjz7sTk67gWHa_7jyDEfPPpnwatps_BNqKbkY9mg.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-9aVu72ZiAptuRRhcpvbcdPkK7J2Qx2bQ9-Onx7SfWtQ.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-HRZg9ecIIrCL4-u_FVFzZp742QT0yr3BfAKOeTMyO-w.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6292497781437164006.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-qnUM1mwqgdRIgcKlMV6VXfbaMfIDwFaTxl-GmdimXdI.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-TX81Etxzk719Cz1t7XXG7WfGYocRQHjtjgl8hBdhjCk.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-7HgJaDaoz44SEq2GJ1kjWjB4OO6vzYfrB7FrF68n714.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-1H51Tg72MKw_X4lqBfFQU_ReNkCXxP_x8M2wRzEMPjc.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-D2H55aQIrHNV1EdcrTpThaVT64McxzKjcdjwpHJwboM.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-NcAXLrybYq1pbnFGueHRJiWK42D_T9UpGThqDcV9tyM.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-jetHsJJcIbaD9CMvNQdCZzShbgd8Eg1YijhsQn2fXy8.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-yosL-3tHj0SNwRNYqpw1b5tDEG32IRape7wrmCotmDo.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-rMjurNqj5nWWoMKjVHwBrgfgUFLrhJi3oUTYpOz7-Js.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-qIF4IGg80jc-B0cTSxh-sSQ3QQhrGfEc6nNAwKIis5Y.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-6cbTAM3oRELUmfyh9sCVbcC0GdgE5aLh6DlOh53mEWw.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-ANYYBfGUjwX8kHZ88hrDvmv2vLE8_VnoK4INOvq2JKI.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-k39F7YT67oUCMd4XjRv-cG45olhzULxXrCY0XU8-ccI.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-WQb4ac3k_wGGHgZgo3ZzVqJMWTQTOJN11ZM5GNrUgdI.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-Qu8vNawVK1SacREE87sOYuQG-VMa6pqiWnQyUtAoLYI.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-oJFbHDiP91yfDq62VgzGbsbPsOxMRw4scYq6rXQaj4o.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-_eeyPBZ0DRGgTSIVJN4MYUwjMO0v6O72iJ0DVVk8vzo.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-zbvSYDhu2VimlJElH88VG4Isib81Ui5poftcbQ5kuBI.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT--XdPkjONBMm3KjXDOzZflr4WFU8aXS7ZUY2x8qWg6x4.jar
    Sep 30, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-tGPHqSSQDkwPtP0cPhZpH2G-W4WR3n-xL6k4z30B33I.jar
    Sep 30, 2020 12:46:26 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 1 seconds
    Sep 30, 2020 12:46:26 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 30, 2020 12:46:26 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 30, 2020 12:46:26 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 30, 2020 12:46:26 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 30, 2020 12:46:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 30, 2020 12:46:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95918 bytes, hash 39fdeba08e515c5a74400bda31396ac7b6652d57167aff6c8c91ab7d94c1b044> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Of3roI5RXFp0QAvaMTlqx7ZlLVcWev9sjJGrfZTBsEQ.pb
    Sep 30, 2020 12:46:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 30, 2020 12:46:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-29_17_46_26-9821662910935396458?project=apache-beam-testing
    Sep 30, 2020 12:46:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-29_17_46_26-9821662910935396458
    Sep 30, 2020 12:46:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-29_17_46_26-9821662910935396458
    Sep 30, 2020 12:46:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-30T00:46:26.813Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 30, 2020 12:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:46:33.988Z: Worker configuration: n1-standard-1 in us-central1-c.
    Sep 30, 2020 12:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:46:34.852Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 30, 2020 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:46:35.062Z: Expanding GroupByKey operations into optimizable parts.
    Sep 30, 2020 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:46:35.111Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 30, 2020 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:46:35.284Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 30, 2020 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:46:35.342Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 30, 2020 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:46:35.397Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 30, 2020 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:46:35.442Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 30, 2020 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:46:35.980Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 30, 2020 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:46:36.096Z: Starting 5 workers in us-central1-c...
    Sep 30, 2020 12:46:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:46:41.871Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
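
The warning above concerns the 100-descriptor quota for custom metrics in Cloud Monitoring and suggests deleting old, unused descriptors. A minimal cleanup sketch using the google-cloud-monitoring Java client, assuming a placeholder project ID and that every descriptor under custom.googleapis.com/* is actually unused; both assumptions should be verified before running anything like this.

    import com.google.api.MetricDescriptor;
    import com.google.cloud.monitoring.v3.MetricServiceClient;
    import com.google.monitoring.v3.ListMetricDescriptorsRequest;
    import com.google.monitoring.v3.ProjectName;

    public class CustomMetricDescriptorCleanup {
      public static void main(String[] args) throws Exception {
        String project = "my-project"; // placeholder: the project hitting the descriptor quota
        try (MetricServiceClient client = MetricServiceClient.create()) {
          // List only user-defined descriptors, the kind Dataflow creates for custom metrics.
          ListMetricDescriptorsRequest request =
              ListMetricDescriptorsRequest.newBuilder()
                  .setName(ProjectName.of(project).toString())
                  .setFilter("metric.type = starts_with(\"custom.googleapis.com/\")")
                  .build();
          for (MetricDescriptor descriptor : client.listMetricDescriptors(request).iterateAll()) {
            // Deleting a descriptor frees quota for new user metrics; confirm it is unused first.
            client.deleteMetricDescriptor(descriptor.getName());
          }
        }
      }
    }
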
    Sep 30, 2020 12:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:47:00.044Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 30, 2020 12:47:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:47:20.127Z: Workers have started successfully.
    Sep 30, 2020 12:47:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:47:20.187Z: Workers have started successfully.
    Sep 30, 2020 12:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:47:51.469Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 30, 2020 12:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:47:51.734Z: Cleaning up.
    Sep 30, 2020 12:47:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:47:51.849Z: Stopping worker pool...
    Sep 30, 2020 12:48:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:48:45.546Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 30, 2020 12:48:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-30T00:48:45.607Z: Worker pool stopped.
    Sep 30, 2020 12:48:51 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-29_17_46_26-9821662910935396458 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): a8225d87-51e6-45c5-87c4-34811da21345 and timestamp: 2020-09-30T00:48:51.895000000Z:
                     Metric:                    Value:
                   read_time                    11.559
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 30, 2020 12:48:52 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 41.461 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 36s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/nmz3kjxmi67j4

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1055

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1055/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-10971] Redirect contribute/design-documents to cwiki.


------------------------------------------
[...truncated 270.79 KB...]
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
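
The exception above names its own remedies: set a Coder explicitly with .setCoder(), or attach a schema with PCollection.setRowSchema so a RowCoder can be derived. A minimal, self-contained sketch of both options, assuming a placeholder schema for the projected author/type/title/score columns and a pass-through DoFn standing in for RowMonitor; this is illustrative only, not the BigQueryIOPushDownIT code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowCoderSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Placeholder schema matching the projected columns in the failing query.
        Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();

        PCollection<Row> input =
            p.apply(
                Create.of(Row.withSchema(schema).addValues("alice", "story", "hello", 3L).build())
                    .withCoder(RowCoder.of(schema))); // remedy 1: set the coder explicitly

        input
            .apply(
                "RowMonitor",
                ParDo.of(
                    new DoFn<Row, Row>() {
                      @ProcessElement
                      public void process(@Element Row row, OutputReceiver<Row> out) {
                        out.output(row);
                      }
                    }))
            .setRowSchema(schema); // remedy 2: attach the schema so a RowCoder can be derived

        p.run().waitUntilFinish();
      }
    }
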

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 29, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
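
The two planner dumps above show the push-down path: with method DIRECT_READ the scan becomes a BeamPushDownIOSourceRel, and the WHERE clause plus the four used fields are handed to BigQuery instead of being evaluated in the Calc. A rough sketch of the same flow the stack traces point at (DDL registration, parseQuery, BeamSqlRelUtils.toPCollection), assuming a placeholder table location and column types; note that BeamSqlEnv is an internal helper rather than a stable public API.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv;
    import org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils;
    import org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTableProvider;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class PushDownQuerySketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Register a BigQuery-backed table; method DIRECT_READ selects the Storage API
        // source, which is what allows projection and filter push-down.
        BeamSqlEnv env = BeamSqlEnv.inMemory(new BigQueryTableProvider());
        env.executeDdl(
            "CREATE EXTERNAL TABLE HACKER_NEWS (`by` VARCHAR, `type` VARCHAR, `title` VARCHAR, `score` BIGINT) "
                + "TYPE 'bigquery' "
                + "LOCATION 'my-project:my_dataset.hacker_news' " // placeholder location
                + "TBLPROPERTIES '{ method: \"DIRECT_READ\" }'");

        // The planner rewrites this into a BeamPushDownIOSourceRel, pushing the filter
        // and the used fields (by, type, title, score) down to BigQuery.
        PCollection<Row> rows =
            BeamSqlRelUtils.toPCollection(
                pipeline,
                env.parseQuery(
                    "SELECT `by` AS author, `type`, `title`, `score` FROM HACKER_NEWS "
                        + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        pipeline.run().waitUntilFinish();
      }
    }
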
    Sep 29, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-S06h9KYUxdf1fBF-19y1OfwUSev7HERLGmwuVsIjbME.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-Xkcrv9cnsd090y7b4cgb8Z7yqqW3WsAy1gO_cP0r6Fc.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test332511998639500270.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-5le9ISdClL_9yVLI-GoVl4Qzx7064AYwA4VyiySJcuM.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-BJ0GHtunGji1mEOXLCbZKaNudgnSat-eXAi_IYei5xk.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-gL2nmzax8WxxRbP-fmGjiwhsyp3Cw7na7EzXclkF4gw.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-EB4gUQDR4KO-ZgQbIA5RFC-C73Y7gZakO98zqs60A1o.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-J4TjeFovjl2Ab978ZAC46jg0YoJm24VzU8q_Pzxk13Q.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-kVAUSxSXkZ8bvVE48rsfNC3uDu1MkjETCDmkldtFhso.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-of0WpIk9ST5x0cIduRBg0zJiw7qN8ak6Vtpn4aBE6Fw.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-ZbVdrMdp8zRcOyyeQXxl2CWiv8QQtGdY9m1P6EG-eho.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-DHGO1exNNd6mjpuglbGrAMKaw9np9zc0_6Uqf2OBWSk.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-gdyOSs9_HxDNDTcB2S3BKTx0jCJ-aektLvDsnxeSDMA.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-7IkCxE_rNGqgjHIqMGmt9kO4d0whFw_qwIrJdonrCuM.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-n7zonTzXVoDqW7fU83x2w9rFdRIoyHhgNcrKw6YIcvg.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-OzI3gMzq5HkDmnBOFLIh6fadIMU71HvX_vtS1JeeBgw.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-0aMNet4pr_9nlByVkEhCMsWe1C0cZ_2ZwJbg8_NwB0A.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-7Mw0MvFBw1Fn5CYUjbZIByP0F2jUFXjaQqTNV9gSgnM.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-S06h9KYUxdf1fBF-19y1OfwUSev7HERLGmwuVsIjbME.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-Y9y_mYphv9cXAo_VJaeDMnedmUgA6TVbcjjYdNgoUvk.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-KD5akkfEmu0R2Akxdpy8EDnDx0Sw8fInvHLe56iITnY.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-7YOGY64KozuqBQO7t5CE4WbakEn2IKJZmCkK_bxY2NQ.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-VWLN9C1vbKwEYDDqhUJRj2pJ5pGMGZCO0UYKzf2eu3k.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-K5hy_lt1m0EdRSFutNxACRSBJH4H9xMzbCKLZZqsehc.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-C7opOM-V1ki1TjwOQrCFMQtxkdd9SfGFmrpByy76uLA.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-h9170ik7GzOUhQQAOWLBi23e0zlkl2eTz8xLIR0IlbY.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-CcGVeLgvGN_JMHS2rezxqFaW9sADJXjWv001fcWNfV8.jar
    Sep 29, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-w1m__QuXuBNxtNkh-QvirI04GV0qo3g0Fyuz_CIjAIo.jar
    Sep 29, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 1 seconds
    Sep 29, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 29, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 29, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 29, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 29, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 29, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95915 bytes, hash cda34bb05d98b951918a1c71e2688661f852c5eb92ab1dbaada324599f39b36f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-zaNLsF2YuVGRihxx4miGYfhSxeuSqx26raMkWZ85s28.pb
    Sep 29, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 29, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-29_11_45_21-8437957726531856427?project=apache-beam-testing
    Sep 29, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-29_11_45_21-8437957726531856427
    Sep 29, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-29_11_45_21-8437957726531856427
    Sep 29, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-29T18:45:21.344Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 29, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:45:28.467Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 29, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:45:29.156Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 29, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:45:29.196Z: Expanding GroupByKey operations into optimizable parts.
    Sep 29, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:45:29.225Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 29, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:45:29.308Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 29, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:45:29.343Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 29, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:45:29.377Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 29, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:45:29.401Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 29, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:45:29.906Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 29, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:45:29.973Z: Starting 5 workers in us-central1-b...
    Sep 29, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:45:54.824Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 29, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:45:54.861Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 29, 2020 6:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:46:00.301Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 29, 2020 6:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:46:02.138Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 29, 2020 6:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:46:18.204Z: Workers have started successfully.
    Sep 29, 2020 6:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:46:18.237Z: Workers have started successfully.
    Sep 29, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:46:57.711Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 29, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:46:57.843Z: Cleaning up.
    Sep 29, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:46:57.909Z: Stopping worker pool...
    Sep 29, 2020 6:47:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:47:48.477Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 29, 2020 6:47:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T18:47:48.515Z: Worker pool stopped.
    Sep 29, 2020 6:47:53 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-29_11_45_21-8437957726531856427 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c3bee6c0-1ca6-4377-9520-e00d88bb38fe and timestamp: 2020-09-29T18:47:53.732000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    20.874

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 29, 2020 6:47:54 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 45.778 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 37s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/2wydiepvvmius

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1054

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1054/display/redirect>

Changes:


------------------------------------------
[...truncated 268.81 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 29, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 29, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 29, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 29, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 29, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 29, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 29, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 29, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 29, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 29, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-gSfVaW0uPKTdpAiv8pU8usJukjI-9DRm4pYNRsp1pvA.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-XumsOKQ1iKrwCkKKPrLUGW1L1NDhzfuOwwTXp7RW5QU.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-gSfVaW0uPKTdpAiv8pU8usJukjI-9DRm4pYNRsp1pvA.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7113768243728609945.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-jUAeoP6DcJlRxqpqajXyYmLyWRDQFvsbZaBd0u3gV60.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-vAXAwz3m_1I6B8wc4uTG_j0n8ZoLXC41QW9d4UPtqEM.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-w5qpeXjdsrFueQgcGvI1xHIj5xgEe0r_eMP2JYm9_IM.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-t_frMk49fHKFdpZZA3RXPwklx1M_BZgSydjUUfmS9cs.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-cwrTo44YokN6qHhwYLpsq987rYULTGQpvN6R2sy9-N8.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-S2h9G-Yjiqd4nVYTNGRWvFBXiQ3cwRFrclPan1qDjOw.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-L5L_2b5boa68rhBFWkheb3IbJsKzv-WiycLyXfSFGuY.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-zBTJEmwi--H4O2OJm_BeT_VxK_NbMAlsh34cPR7nE30.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-3fCvDoowKklTFZAbqc0N_bS2WIUJ5tb8IHurMJMR1H8.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-meTkSMOzYwwz-W7zjn1o4c7g3Cz_N0fy-qxdDPtxsBY.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-BtS2mh3OJlikPYOSB58lIDxlNIoVMuLe5icq8_OWdcM.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-NrTCTJ63l0BrlrdSlEVOBRQ4PqBpY7QLDv_WfL7_0NA.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-zM1DuKqRCjEu4wcvZodUEnWkkfuDLqx31L9dfhOdtNs.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-3XItXmIs6XoUcZ5mOUb3_JWV2h-AtjzNadRoV5GWCRU.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-Lsr-ialuUw1pXogNLc0U187Klyf6dFFKUjjFaiOgrWw.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-M0Oo2APLsE0O7WE91_4uZydk2rot8kxHGas9qnNeJ6Y.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-UWXmskVUU6rRQHretEvvbPC5CziDSsQMHsO7fGZXQWc.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-Fl0HrfI1huah3yWcZznc9tuK3Bpu69fCuD7986EOJFU.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-zl6J62pe88KSB0kTGjEQA71KOToavRA4397tiPpCK90.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-Ro2Oyapss4EoR6e0kjVFi-WDfsDrcm9JDjpExsAxgOg.jar
    Sep 29, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-IhSbZ5fO0K43TgFuJjy_gh2WigqrX1im22VcDIhsce8.jar
    Sep 29, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-SxGrUaoBHFThaxRCUEh4SoGc4i6XrDC1TzV9tjFG3zM.jar
    Sep 29, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-iPc0b8Zsff2rN8tNs_p9kOxHiVFf-0FN8F6pSZVn4LM.jar
    Sep 29, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-woVqDJNoBq7SRm9lNN_KJeamXL-PoG2iDaMTjKUDSic.jar
    Sep 29, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 0 seconds
    Sep 29, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 29, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 29, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 29, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 29, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 29, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash 546816d0bb16fc952c44ccf73672e1042b655aee1de86fb98f321d8920928b03> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-VGgW0LsW_JUsRMz3NnLhBCtlWu4d6G-5jzIdiSCSiwM.pb
    Sep 29, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 29, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-29_05_45_22-2927391320876390192?project=apache-beam-testing
    Sep 29, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-29_05_45_22-2927391320876390192
    Sep 29, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-29_05_45_22-2927391320876390192
    Sep 29, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-29T12:45:22.298Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 29, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:45:29.714Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 29, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:45:30.595Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 29, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:45:30.638Z: Expanding GroupByKey operations into optimizable parts.
    Sep 29, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:45:30.737Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 29, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:45:30.860Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 29, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:45:30.893Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 29, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:45:30.941Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 29, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:45:30.977Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 29, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:45:31.456Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 29, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:45:31.533Z: Starting 5 workers in us-central1-f...
    Sep 29, 2020 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:45:56.992Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 29, 2020 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:45:57.865Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 29, 2020 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:46:17.464Z: Workers have started successfully.
    Sep 29, 2020 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:46:17.501Z: Workers have started successfully.
    Sep 29, 2020 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:46:50.837Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 29, 2020 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:46:51.018Z: Cleaning up.
    Sep 29, 2020 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:46:51.105Z: Stopping worker pool...
    Sep 29, 2020 12:47:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:47:37.021Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 29, 2020 12:47:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T12:47:37.076Z: Worker pool stopped.
    Sep 29, 2020 12:47:43 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-29_05_45_22-2927391320876390192 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0b148667-cd7a-4d60-8394-344020d3fe81 and timestamp: 2020-09-29T12:47:43.779000000Z:
                     Metric:                    Value:
                   read_time                    13.228
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 29, 2020 12:47:44 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
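
The warning above only means the run was started without InfluxDB settings, so the read_time and fields_read values stay in the console instead of being published. As a rough, hedged sketch, Beam's test utilities build such settings along these lines (host, database and measurement names here are placeholders, and how this particular IT wires them in is an assumption):

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    // Placeholder values; a real run would take these from pipeline options.
    InfluxDBSettings settings =
        InfluxDBSettings.builder()
            .withHost("http://localhost:8086")
            .withDatabase("beam_test_metrics")
            .withMeasurement("sql_bqio_read_java_batch")
            .get();
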

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 35.529 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 27s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/frfsy4o66wkzu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1053

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1053/display/redirect>

Changes:


------------------------------------------
[...truncated 271.55 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
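
This coder failure, which repeats in each build below, spells out its own remedy: the Row output of the ParDo(RowMonitor) step has no schema, so no coder can be inferred. A minimal sketch of what the message suggests, assuming access to that PCollection<Row>; the field names are illustrative, not the test's actual schema:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Attach an explicit row schema so a coder can be chosen for the PCollection<Row>.
    static PCollection<Row> withRowSchema(PCollection<Row> rows) {
      Schema schema =
          Schema.builder()
              .addStringField("author")
              .addStringField("type")
              .addStringField("title")
              .addInt32Field("score")
              .build();
      return rows.setRowSchema(schema); // roughly equivalent to setCoder(RowCoder.of(schema))
    }
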

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 29, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 29, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-LwpkD_TEfl6ip8ABur9nemM1xuVKbkLEJ4uupnipl7A.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-yHHNbH9WRbBUTqY2tHzZPy-3k0RmM8vS5Zk-o8-KLDw.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-lLEklKVhQILrE_8Pof6sNgxYVw_2O77CYpgp2uC5MrE.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1987679953031993710.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-o2rCEXOrEc-qVK5k3IjBO6s7neUa2_pGbb-W0q8U9vU.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-keoPftgi8Ob1SHABMU5kd0E4YSMYeSDZWnl7K4CeVuM.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-Kt2GbMYN5cysbk84W8vU8MmmmrRPshSA72EOoao3Bpg.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-wJD-oiS_y4Ue5GTNKWb4endz5Wj6scruUk8syFQ-yCI.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-jR_ACGbzRXCAJoyYI9L5kFvokmILQ6BtREbTqKxvmjs.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-ify5JnA0PGnL72g4tFsu4E40QJjKS5W9xEdS7tQ99ls.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-I63IGrWs2dF3HhNMASUswU_-_z7PSZbl6AvUbCjOH2Y.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-mbFlByXtlNpnpYTFJ6snJkSipY7J-2dN-tfN1Bm1XsQ.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-whtOcV1BsL99mTayMnd_m31oPDE5AEHgYycrRBcrYtE.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-Cg70QzmmI7Q0Lp6T1HGmlOtIwLPkmYrWVJApEJRpOsM.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-d9OWel8tQ2u5H0GJOd-C7F9xjr1N1RTWkYqeqRC6Dlc.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-nQLlWGsLe0JxUzyZpol5N_JBVhLfVM5BpSQ_F8UhwSA.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-CdawxFGUYm7vKUbXYqKR_WZFF-n8a_U39XDSWATl8U4.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-LwpkD_TEfl6ip8ABur9nemM1xuVKbkLEJ4uupnipl7A.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-7EtcMgsVwna3S083GlC2qRnoYxVa9GGJnCpGVnh8tjw.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-QQbs2XX544yCZ5jO_is5kXdX5xb0Bk_0BmccsE5mUOk.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-UltHgTB0AnmVme3OTwVo7GccEBpYeK0JmOGotp6G2j4.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-HEpDvKfWTtXRba6VfMrIPG0rWd2spy8ieuQNotcCW-8.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-K-18gMb8RqFMOzer8rOHlnIa49lUv-ZQ_dhQ7NGy_4Q.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-v-pcACkitNXmFOnaj7YoQVMBO8l3eL5zJXl6GMwA65Y.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-ZdLWmuv5tNglwNb1rXU3-vRdx_edDjLEHZsMpEQL1us.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-VW33o_Im8QYBOHZWat5ZtbuVdWx3oBKNx5tAKR6F_p8.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-3GCmB5o7COCgbC1sMuqdO41unzg-8Ov9dYL8heKkKkk.jar
    Sep 29, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-Do7qUeK3LfZ5kh6q501AKqGRys0YYJsrRA_gh-8bWNs.jar
    Sep 29, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 0 seconds
    Sep 29, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 29, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 29, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 29, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 29, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 29, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95917 bytes, hash f4284a3b4b874c004d3aa40e6621e8fc24102288f1a40e234190c13566a85e96> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-9ChKO0uHTABNOqQOZiHo_CQQIojxpA4jQZDBNWaoXpY.pb
    Sep 29, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 29, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-28_23_45_20-11414632088283701576?project=apache-beam-testing
    Sep 29, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-28_23_45_20-11414632088283701576
    Sep 29, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-28_23_45_20-11414632088283701576
    Sep 29, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-29T06:45:20.234Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 29, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:45:27.284Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 29, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:45:29.017Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 29, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:45:29.141Z: Expanding GroupByKey operations into optimizable parts.
    Sep 29, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:45:29.172Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 29, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:45:29.248Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 29, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:45:29.286Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 29, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:45:29.322Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 29, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:45:29.347Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 29, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:45:29.949Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 29, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:45:30.053Z: Starting 5 workers in us-central1-b...
    Sep 29, 2020 6:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:45:40.652Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 29, 2020 6:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:45:56.782Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 29, 2020 6:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:46:13.108Z: Workers have started successfully.
    Sep 29, 2020 6:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:46:13.146Z: Workers have started successfully.
    Sep 29, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:46:44.324Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 29, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:46:44.470Z: Cleaning up.
    Sep 29, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:46:44.538Z: Stopping worker pool...
    Sep 29, 2020 6:47:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:47:36.073Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 29, 2020 6:47:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T06:47:36.111Z: Worker pool stopped.
    Sep 29, 2020 6:47:42 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-28_23_45_20-11414632088283701576 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c3f05d1e-f435-4bc9-b303-c153dcf6d827 and timestamp: 2020-09-29T06:47:42.389000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.624

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 29, 2020 6:47:42 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 35.218 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 25s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/kmkr2so27ntmk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1052

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1052/display/redirect?page=changes>

Changes:

[Robert Bradshaw] [BEAM-10978] Fix bug with map type inference.

[noreply] [BEAM-10882]  Update Snowflake docs (#12823)


------------------------------------------
[...truncated 270.41 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 29, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 29, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 29, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 29, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 29, 2020 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 29, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 29, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 29, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 29, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 29, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-sZf0FsJQdGsa1iI9LYD83s32hZYZfG3VTpAZoTHkO3c.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-70Psw04o0PeDVMYRp9-2nWhBMGwhn0GFIWWgVDBpWeQ.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-2d6RrgF3AMWSWApUdyd04ju-uorJy8tRbLFUjDN8OXg.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-TAkjyoZJE_e1Hw8QkGM5u-98yOvbsKMrTnfEY1EWUIo.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-9g8AGXyPesWBY2RWGpZd_rPHswEjOA2XwmeY9r1oBOA.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-EUwi8SK1wbX1y1-r8r0L5vVxYewxELI7XsKRJq2xg7Y.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-9Lde-q-lgGXqYVZ5VebvqkoiwkAZ9rzLI_sb-wd4QpY.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-nq4ifGCLejmAmDT03shQ_4Vt1GblRZ0Fhb-r0xt-tqA.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-Cywi0Pl1nZG4SXV8lf9AFW0LLq4lVpCUS0i_uYmBTBI.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-MM3rHTVCW1jLKQFdBK1Ag6RGczBdNxVJC-tCayFH6IM.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-Ka-peIj93YT-Qj7csbQy_FBJNfcEj-ONFepx9qwzzX0.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-3f1XD0i9kVRuBPgYHAtM_UCfhuqwYHIMb5Cktw2baTU.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-qAHRfbrTELZl6lLVkcb3zKxMidZh11F1lLH93chyjOw.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-pCv53ldZfgkb37w9Gpp2CU3OaES0-huO9E97kS1PzHU.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-klh3rMtvqKmAFMlHaK89G8RUe6pI_--e5_UYCrnv6MM.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-LVBhpBFwupKsBDON12HESob04QGDaGXmI4tAkHhHOl0.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-vGDi7mtL3lx6Vkb9GPsENRtuWI1_O8KVQ4qew63nsow.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tC-XiFbtW2c9eN2xTUu1ZMmET4oAMArMoS6KcYT3is4.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-qxrEf9-uitO7NJNe5vnWbiXHwRGkYI4R0dNj-0W4nOk.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-pGsMFkwKCrrFjyPRxJtjD4eDdMWjaMN0mZdfAzcZv0k.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-obVEL_pSZomw85YzQ3sgvXyFPO4voMRh5tp8kZoQfpo.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3239215713181194417.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-zOMRRZXuOVLbquHT0Wp5rEQLoVmOE_xyRKbKInJnMr0.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-JLGsEAF3W5eK3Tmxp6sbo16I7AtZeZZPYyoMqSWULac.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-sZf0FsJQdGsa1iI9LYD83s32hZYZfG3VTpAZoTHkO3c.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-lK6TgY5WMARqb8dxxBe-CW2fnp5LqAgu5t-VqZ3toFA.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-NTPf8SdGnAoEUwg_59vKjj-q2VPstb64hiQoBO2RzsE.jar
    Sep 29, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-6Hu1X5bHHa2v2SWKr56itgUwYpBvdoCrDTiWwtFqDcc.jar
    Sep 29, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 0 seconds
    Sep 29, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 29, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 29, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 29, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 29, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 29, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash 608bde57cd5690610644171f2375cae7cd994af41d195e27332d5844173b0eb7> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-YIveV81WkGEGRBcfI3XK582ZSvQdGV4nMy1YRBc7Drc.pb
    Sep 29, 2020 12:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 29, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-28_17_45_20-13298363548966251720?project=apache-beam-testing
    Sep 29, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-28_17_45_20-13298363548966251720
    Sep 29, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-28_17_45_20-13298363548966251720
    Sep 29, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-29T00:45:20.789Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 29, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:45:28.870Z: Worker configuration: n1-standard-1 in us-central1-c.
    Sep 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:45:29.595Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:45:29.667Z: Expanding GroupByKey operations into optimizable parts.
    Sep 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:45:29.710Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:45:29.790Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:45:29.854Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:45:29.898Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:45:29.960Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:45:30.370Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 29, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:45:30.466Z: Starting 5 workers in us-central1-c...
    Sep 29, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:45:46.978Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 29, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:45:54.462Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 29, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:46:14.603Z: Workers have started successfully.
    Sep 29, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:46:14.641Z: Workers have started successfully.
    Sep 29, 2020 12:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:46:48.914Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 29, 2020 12:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:46:49.085Z: Cleaning up.
    Sep 29, 2020 12:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:46:49.177Z: Stopping worker pool...
    Sep 29, 2020 12:47:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:47:40.648Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 29, 2020 12:47:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-29T00:47:40.706Z: Worker pool stopped.
    Sep 29, 2020 12:47:46 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-28_17_45_20-13298363548966251720 finished with status DONE.
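
For context, the plan logged above pushes both the projection and the filter into the BigQuery read (usedFields plus the supported BigQueryFilter), so only the four used fields are scanned. A hedged sketch, not the integration test's own code (the test drives the query through BeamSqlEnv/BeamSqlRelUtils, as the stack traces show), of how the same logged SQL could be issued from Java via SqlTransform; the Pipeline and the registration of the HACKER_NEWS BigQuery table are assumed, and tableProvider is a placeholder:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.extensions.sql.meta.provider.TableProvider;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Sketch only: resolving HACKER_NEWS depends on how the table provider is
    // registered; a DIRECT_READ BigQuery table is assumed to be defined elsewhere.
    static PCollection<Row> runPushDownQuery(Pipeline pipeline, TableProvider tableProvider) {
      return pipeline.apply(
          SqlTransform.query(
                  "SELECT `by` AS author, type, title, score "
                      + "FROM HACKER_NEWS "
                      + "WHERE (type = 'story' OR type = 'job') AND score > 2")
              .withTableProvider("bigquery", tableProvider));
    }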

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 729239e1-bd09-4e15-ad15-4880c1ad8f1a and timestamp: 2020-09-29T00:47:46.172000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.705

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 29, 2020 12:47:46 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 38.561 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 29s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/my42z35qtzvyq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1051

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1051/display/redirect?page=changes>

Changes:

[noreply] Update test_get_python_sdk_name to supported python version (#12950)


------------------------------------------
[...truncated 271.51 KB...]
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 28, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 28, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 28, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 28, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 28, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 28, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 28, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 28, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 28, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 28, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-h3jjyUjFnRAN2R9XUn9Y8jfdZ5hu_K1BGJXc00h89mA.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-ucOLTYfKO64YPc8GZi1p3pNZis89otR9OBJS3G4G2GM.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-5FssLmcgdBkSsr4gNW9atsY5a1ZEcsLTZC4-tYwMdQY.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4559249131501615629.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-kxaRdDrOFdJVnv_BNTCcoxv5Lg26HebFfql4QfYjhiI.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-F65FPApNG6lt6Qw5X02wP8eUccuUtVQMPZq4ov02lCI.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-jRyNWOYfuH7fUi-yJp3kgXyFgw2rnEPM9rEcmAbz5Fw.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-u1w6A0MG0zX7CqeZr9IVBFtgFdX_Zm6Sa6I5PaCszuo.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-yAfV6aOfzdeZlBDWlmJbKH8LyevpPfUfud5W9V_2O4M.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-UkCmVgfiT4ctRJtzZaWhQz7xS8RoFYhq5weW-HvUsfI.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-io4_DjOhu9L9EQncIb3jbT7OIk_z26OiDSzWvk6m8Rw.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-Qf2V6ojWm6zzPBcPg5WZE2QPMANnjM1BB0ENuhx_pu8.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-OfRaDGPr0vybbYTaVlqv3-eS9cGIwj1yKgRy89nKIZo.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-RCMnZOdVB9SFdJZ6PBylAstJ32cvJBuIRSpzFXNYhAE.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-ipWdHXqYznsChNpHM6RrHW-_z4vwIf4O28RAeAWEQic.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-HKdipn-ott8ce9DWHzwMpliyjmxier4A63ByFUXkuiE.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-l_79uoMT_nZ9pyi0hZO7NoxGYePua7PiivcMP73BVyM.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-0gh9olCmYnlfvluCSZE81tqnj7RIGtVmZeq12-mGMVQ.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-qadZA8zVeriQodbJf9ZKLLUEa3EDHCPnxyM5_AYJYOw.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-oiHtovHIUho2HBjAoePXdjiuxOVyRSEuueIjJ-RMc3E.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-hWKQyCmMQPP_BYCIN59vz-JfcREQRto8Pil9IXdFKuE.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-VCoWym0pIu2B8K1no7J5hkni7CxUFXEBw_MppxwadOw.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-odwr3JhhtQ9OQKZDuB_btw9sXznq2WvnvnAtrl3GgNw.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-f3CRY6LzfxkIj1QBdDJHdiKvaPI4aE66XeDwpZA4w0Y.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-YXdQS8iaPDoKu6Pbu7RtEZUabXdcPT-ZPtFXahc_dzo.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-oDM1nuirkreipaNml4WpLFGqP6TsD2JYOMaBYX1Q7ns.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-h3jjyUjFnRAN2R9XUn9Y8jfdZ5hu_K1BGJXc00h89mA.jar
    Sep 28, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-8BBNjHE9t0a69IWa2gERcLhTDpWR4kgBIvmvuWQ-deY.jar
    Sep 28, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 1 seconds
    Sep 28, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 28, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 28, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 28, 2020 6:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 28, 2020 6:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 28, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash 75e897fe9eee631182ec22ec4da96e78d91a466c249b631b16a1bfa73adda0bf> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-deiX_p7uYxGC7CLsTalueNkaRmwkm2MbFqG_pzrdoL8.pb
    Sep 28, 2020 6:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 28, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-28_11_45_24-15917909757904601353?project=apache-beam-testing
    Sep 28, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-28_11_45_24-15917909757904601353
    Sep 28, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-28_11_45_24-15917909757904601353
    Sep 28, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-28T18:45:24.420Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 28, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:45:30.947Z: Worker configuration: n1-standard-1 in us-central1-c.
    Sep 28, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:45:31.755Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 28, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:45:31.798Z: Expanding GroupByKey operations into optimizable parts.
    Sep 28, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:45:31.822Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 28, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:45:31.889Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 28, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:45:31.917Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 28, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:45:31.950Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 28, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:45:31.980Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 28, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:45:32.436Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 28, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:45:32.498Z: Starting 5 workers in us-central1-c...
    Sep 28, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:45:42.141Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 28, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:45:55.833Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 28, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:45:55.861Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 28, 2020 6:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:46:01.130Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 28, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:46:21.321Z: Workers have started successfully.
    Sep 28, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:46:21.370Z: Workers have started successfully.
    Sep 28, 2020 6:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:46:54.295Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 28, 2020 6:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:46:54.444Z: Cleaning up.
    Sep 28, 2020 6:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:46:54.508Z: Stopping worker pool...
    Sep 28, 2020 6:47:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:47:45.709Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 28, 2020 6:47:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T18:47:45.752Z: Worker pool stopped.
    Sep 28, 2020 6:47:52 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-28_11_45_24-15917909757904601353 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 428d30a9-5a80-42ab-bb88-e7b1021591b1 and timestamp: 2020-09-28T18:47:52.646000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.694

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 28, 2020 6:47:53 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 2 mins 44.23 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 35s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/2kxkylrxuobmm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1050

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1050/display/redirect?page=changes>

Changes:

[nosacky] Fix up-to-date checking mechanism on Tox Task.

[noreply] [BEAM-10481] Ensure registration of the accumulator occurs. (#12850)


------------------------------------------
[...truncated 279.78 KB...]
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 28, 2020 12:45:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 28, 2020 12:45:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 12:45:38 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 28, 2020 12:45:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 28, 2020 12:45:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 12:45:38 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 28, 2020 12:45:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 28, 2020 12:45:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2020 12:45:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 12:45:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 28, 2020 12:45:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2020 12:45:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 12:45:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 28, 2020 12:45:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 28, 2020 12:45:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
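
[Editorial note] The line above is the predicate the SQL planner hands to the BigQuery Storage API. For reference, a rough hand-written equivalent of what the pushed-down read does, sketched directly with BigQueryIO, is shown below; the table reference is an assumption for illustration and is not the table this job actually reads.

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        PCollection<TableRow> rows =
            pipeline.apply(
                "Read with projection and filter pushed to the Storage API",
                BigQueryIO.readTableRows()
                    // Hypothetical table reference, used here only for illustration.
                    .from("bigquery-public-data:hacker_news.full")
                    .withMethod(TypedRead.Method.DIRECT_READ)
                    // Only the projected columns are requested from the Storage API.
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // The filter is evaluated server-side as a row restriction.
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        pipeline.run().waitUntilFinish();
      }
    }

Because only four columns are requested and the row restriction is applied server-side, the push-down variant reads far fewer fields than a plain DIRECT_READ of the full table, which is what the fields_read/read_time metrics further down are measuring.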
    Sep 28, 2020 12:45:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-9uVLE6MNxJh5fq9lkd0JHnb2_ngB7NufXk3fzatLX14.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-dEFwGkkr1HKF3rmvwUjWAyrhxuvetft8fnElrI2p28U.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-HnI-IAKH7z8XJyYaJPLpznEhPZ3Yj2BVfh0g5BsXcl4.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-1gNkBYes3uXTam9-oHKTqsOU2ZgWJN_YgIUgUCKJlHY.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-MqJyUYJKWMMEPFij-vZfLhgy3wj3UoroMHeH87lvIaI.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-C3o3dR9T94TTO37TCqO0Ue-gT2dVhkrXgfKoyg5XwTA.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-e4LMjRBBpzNWxdx-p-Je8qmHtIY26buWId-gKgRr2HA.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-VgqH2C-zTYZ1Io76rZEDe8piQ4KbyoYx6teh3xxh_CY.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-nttlflT5-prteLwhDGu_odRwuI81J2FDKNgHX8Xn-m0.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-h4LublY0qLDsJrZ2XERkFOj1TGxGyKk6ddQeSvVRzTM.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-KBmjD_4P2BNg-4Efv1uWlvc-Guwo3eZoMYoZ8tKtEIU.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-PL7oE8EkImnsPzq07ftQLAgwz7FwuOckNOfdNzchIs4.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-b7G0-6PUd_NUcz7yjNDH552KLvO03srWcZSvHThq7pM.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-hL44GhQg6xhkF6s9l5axN2QWlcOHeLCRoKMyO3_W5r0.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-XzIs0uEZs8iItazf0SQ22cPG7gaAt5tLb5FT4rinwZc.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1500760170008985425.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-SIcGP0cEjcvxrkFjVm8-iOsk7xf5BW--FnPcYZM1Kxk.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-aucEzhzdgjgKUZspIBmyE-pVDqpGePMwnmX90nyn0I0.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-pfRnmDokLNWhPa0Y6ViKnCJXQyESBdYzdNjJUQTwyuQ.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-J68IOU5a9KC-wXgajmF1jnYTq3JRbH7n86x-2wr7ZxY.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-W6PgZnDBeyqKTokPSjnN95v2v1MwtwFB0WygNqr3x-c.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-hiAqumh1xg9BgGNmGR3_WbDF4IeFx1M41fH0OEE8NnY.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-MqJyUYJKWMMEPFij-vZfLhgy3wj3UoroMHeH87lvIaI.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-07FuoKvRQI-9R_fvdE5Hq3xYDA21DWkZQyQvNvDpvRI.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-U8Yvyacjb5mN7dhO4rQBNCW-60R9_haaILrExzWcPX8.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-IF4j2HinKmX8GH6IhSs4s6dA9EFTyBK0aRg_pCIJE9A.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-Qwz0BV9tJpIeBdV1UiqP0fzP8pnHsW7tzUCKVDp4EsQ.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-2BLzCwrfKQe_ArjNTdvx6s9TGOVoaB4W1ZBECMcqrqw.jar
    Sep 28, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.alibaba/fastjson/1.2.68/9e3d29f05bcfab1c15a1357ebf2dd513c1d42f49/fastjson-1.2.68.jar to gs://temp-storage-for-perf-tests/loadtests/staging/fastjson-1.2.68-cGrbCezeeBQfDPJGWh6b307ug_n5g8_BYqWhckhy_rs.jar
    Sep 28, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 27 files newly uploaded in 0 seconds
    Sep 28, 2020 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 28, 2020 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 28, 2020 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 28, 2020 12:45:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 28, 2020 12:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 28, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash 8286ed8db4b3add32fd3f90473807d16b82dfa103233dabc98624b04022f6ad0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-gobtjbSzrdMv0_kEc4B9Frgt-hAyM9q8mGJLBAIvatA.pb
    Sep 28, 2020 12:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 28, 2020 12:45:45 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-28_05_45_43-7419717706125343924?project=apache-beam-testing
    Sep 28, 2020 12:45:45 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-28_05_45_43-7419717706125343924
    Sep 28, 2020 12:45:45 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-28_05_45_43-7419717706125343924
    Sep 28, 2020 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-28T12:45:43.843Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 28, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:45:52.057Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 28, 2020 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:45:52.663Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 28, 2020 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:45:52.701Z: Expanding GroupByKey operations into optimizable parts.
    Sep 28, 2020 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:45:52.730Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 28, 2020 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:45:52.788Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 28, 2020 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:45:52.816Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 28, 2020 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:45:52.838Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 28, 2020 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:45:52.859Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 28, 2020 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:45:53.338Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 28, 2020 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:45:53.407Z: Starting 5 workers in us-central1-f...
    Sep 28, 2020 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:46:17.609Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 28, 2020 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:46:25.615Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 28, 2020 12:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:46:43.649Z: Workers have started successfully.
    Sep 28, 2020 12:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:46:43.687Z: Workers have started successfully.
    Sep 28, 2020 12:47:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:47:12.105Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 28, 2020 12:47:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:47:12.237Z: Cleaning up.
    Sep 28, 2020 12:47:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:47:12.311Z: Stopping worker pool...
    Sep 28, 2020 12:48:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:48:12.863Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 28, 2020 12:48:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T12:48:12.903Z: Worker pool stopped.
    Sep 28, 2020 12:48:17 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-28_05_45_43-7419717706125343924 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f8567e64-1e2b-4dab-b4cd-b7cf06f0c9aa and timestamp: 2020-09-28T12:48:17.883000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    10.487

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 28, 2020 12:48:18 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 47.737 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 1s
107 actionable tasks: 68 executed, 39 from cache

Publishing build scan...
https://gradle.com/s/cc6k575wbb4bk

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1049

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1049/display/redirect>

Changes:


------------------------------------------
[...truncated 271.71 KB...]
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 28, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 28, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 28, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 28, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 28, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 28, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 28, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 28, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 28, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 28, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 28, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-_bxHSd15fEEL940JIkB0R_PG9UVduLNAI7rUSi7jON8.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6916825691573026148.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ATK3QZner8TmMVhGeaTawCG7hoP8QnPfXDwcikqhxz8.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-E8Iw-me3MKPVz5odWLPBhtnwYVvIkc7wYmlcCnV-XUY.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-ZYddUcNzp7kMNGIHJrNFGyE-AYqaupuKJPVSfxCkw28.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-GMrgPq07l7EWGc4vDEtNnPOcLd1c0-dIAwd693Ablo8.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-Q8T3Ywk4dXvMTZbOII9GPs0bom0gF76dyBmlelTUjOM.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-afYg-VF8-rSOPiKn4r17Yicfm88qiqH87hypmLLdVV0.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-1daSSqzz_OVUbAr3RVIrxi48wUu1BC760_U6NYrIelg.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-42DhFcnXXs9VQJQuTxDgl7W8uhUp2Z1KTzSZuRoYSuQ.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-fxQhQ6wnqk2zJfihNsvPgA095qdq1n1accJi8LAczdI.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-NxPiyjzSppZF1bWYHQzdzAh4BOM5gFz0BkXnItINZu8.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-vkKj3kXvYG8Xz5ZEkZUoorAv-A2tQmmnNaHocTrmRlU.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-yXcEzd9vwxTDgE1lJm8sQDpqZjolHqKkJ1lieZt8fAw.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-4bspIpGBPbM7dl05FQS8QpSQkjWo9OgRR61zlJ082rQ.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-BXpIL-exVxx1QAUWbrCWOOSlZ2gc_oAeOisY3Dh57gc.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-vkKj3kXvYG8Xz5ZEkZUoorAv-A2tQmmnNaHocTrmRlU.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-9WWeGTxRGQ7c04oGEPG6yV3uC7WMZrQ0yKeKHqGeZYM.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-W462tH2VYXk8XnnrqLZuvy2UyNeGv4YTMIZFcnCBb1k.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-aDqRfbgnzXueFsHvua3rpXM_6rgoDdl4sytt4jZl-_A.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-AW525wwAo4CE5gcToM6Ant7_lT1D3n8FsJXQRapBGe8.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-V7NBd4LZpEYWTe7YlMBnL8D85M2FNedKYXU7VIR0oUs.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-ICWZTiQGqjuQwR0ZG72Pas8gEYE_GcaJsFjyVehUuRw.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-wS5giF61VeQN3IaBgx5_NS5gskenhFcx5cq9gUY_fA4.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-c316MZo_s2pXVIk0MTgiQPO3Jpi8It-BTigZ1BUkQFA.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-1te8WqapAVWkzKTbGW4keQjMpZ15MRH2VWSx7axPpLE.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-m1Cv6DrWOWx56nOwpi_V0Q4IbUaDW217Ur68suNTgH0.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-wcgQsvO8z2OyhmRlbviOh0THlX3IZD5EErja_7IJDog.jar
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 0 seconds
    Sep 28, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 28, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 28, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 28, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 28, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 28, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95914 bytes, hash 1cbf7c04fa10e297fd286e5ca42a3318e18f96f18514cfdb7ef2f670c752d706> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-HL98BPoQ4pf9KG5cpCozGOGPlvGFFM_bfvL2cMdS1wY.pb
    Sep 28, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 28, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-27_23_45_18-17552227632398297292?project=apache-beam-testing
    Sep 28, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-27_23_45_18-17552227632398297292
    Sep 28, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-27_23_45_18-17552227632398297292
    Sep 28, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-28T06:45:18.525Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 28, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:45:26.893Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 28, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:45:27.685Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 28, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:45:27.733Z: Expanding GroupByKey operations into optimizable parts.
    Sep 28, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:45:27.770Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 28, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:45:27.850Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 28, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:45:27.877Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 28, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:45:27.912Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 28, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:45:27.939Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 28, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:45:28.299Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 28, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:45:28.388Z: Starting 5 workers in us-central1-f...
    Sep 28, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:45:39.717Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 28, 2020 6:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:45:57.653Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Sep 28, 2020 6:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:45:57.698Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Sep 28, 2020 6:46:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:46:03.011Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 28, 2020 6:46:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:46:03.042Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 28, 2020 6:46:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:46:21.921Z: Workers have started successfully.
    Sep 28, 2020 6:46:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:46:21.948Z: Workers have started successfully.
    Sep 28, 2020 6:46:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:46:24.353Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 28, 2020 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:46:58.263Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 28, 2020 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:46:58.476Z: Cleaning up.
    Sep 28, 2020 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:46:58.541Z: Stopping worker pool...
    Sep 28, 2020 6:47:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:47:51.628Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 28, 2020 6:47:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T06:47:51.681Z: Worker pool stopped.
    Sep 28, 2020 6:47:57 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-27_23_45_18-17552227632398297292 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 642fe60c-3310-43ce-a1b0-c85d54894ce7 and timestamp: 2020-09-28T06:47:57.102000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    17.116

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 28, 2020 6:47:57 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 51.511 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 40s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/qtwauf3rgqpas

Build cache (/home/jenkins/.gradle/caches/build-cache-1) removing files not accessed on or after Mon Sep 21 06:44:21 UTC 2020.
Build cache (/home/jenkins/.gradle/caches/build-cache-1) cleaned up in 0.192 secs.
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1048

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1048/display/redirect>

Changes:


------------------------------------------
[...truncated 270.95 KB...]
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 28, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 28, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 28, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 28, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 28, 2020 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
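
[Editor's note] The IllegalStateException above is the failure shared by readUsingDirectReadMethod and readUsingDefaultMethod: the PCollection<Row> emitted by ParDo(RowMonitor) has no schema, so Beam cannot infer a RowCoder for it. The sketch below only illustrates the remedy the error message itself suggests (PCollection.setRowSchema); the class, DoFn, and schema are hypothetical stand-ins, not the code used by BigQueryIOPushDownIT.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaExample {
      // Hypothetical stand-in for the RowMonitor DoFn in the log: turns a title into a Row.
      static class ToRowFn extends DoFn<String, Row> {
        private final Schema schema;
        ToRowFn(Schema schema) { this.schema = schema; }

        @ProcessElement
        public void process(@Element String title, OutputReceiver<Row> out) {
          out.output(Row.withSchema(schema).addValues("alice", "story", title, 3).build());
        }
      }

      public static void main(String[] args) {
        // Illustrative schema matching the columns projected by the query in the log.
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt32Field("score")
            .build();

        Pipeline p = Pipeline.create();
        PCollection<Row> rows =
            p.apply(Create.of("hello", "world"))
             .apply("ToRow", ParDo.of(new ToRowFn(schema)))
             // Without setRowSchema, PCollection.getCoder() fails exactly as in the
             // stack trace above, because no coder can be inferred for a Beam Row.
             .setRowSchema(schema);
        p.run().waitUntilFinish();
      }
    }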

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 28, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 28, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 28, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 28, 2020 12:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 28, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-X9zsSq6L4jaSwzp0_JQgIXVLeA5Ao4ArV1NeSNF5OI0.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-4CNd0_Z09F-ua4Zu2pWdHiOenOilMIlJ-SKMvfI1ehw.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-ULFt5jU_LtBajqcBCuhtDch9jB6x-rAdg8Xpti8Kyvg.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-ugD73aS8u0jI-PVciT76TIyrQrBKaUpwPuD1uHPDd_0.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-HZ6O6QJEk-boNyijs_TF0XhY4xH6iE2sj7HeGQ7K_n0.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-UKLr2BjT56D8xAWOHpz2v7PAi4NLqloetiCmlBkuHl4.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2553409351212571711.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-yIn0WmE8ssNPVN4FK6iBhNIrKtO7Nluhj96O_80u45M.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-YRDAA3dBEYu85-vei-Eo0qocLKAXoS3dhTTwp3hBhOI.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-hn9V-U3blw1XcmtJI98A8pAblD_av4K0hJelGJyrOKQ.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-qj3Pt3Bv1jIbNrOS7YxnM8JvQoCqlzESsbfG8b43aPQ.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-X9zsSq6L4jaSwzp0_JQgIXVLeA5Ao4ArV1NeSNF5OI0.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-EblEc17xHi4-GQEM5oHWEiwVRB04_HIFl0h0-XcMrzw.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-xoDmHC5v2tBB2kH5UuBkAoU_VWdYxlH7ZjJfzGdUO08.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-txXeUrLz8-HNqDLFn8X5vUqJQnB9Sdkm_XJeFEBK0xY.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-Ae60p5bby_GT_949U5NkcdqZbWBmWxrdLNIRwHXJoA0.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-lrICINdkI7WpAoLQAYBJJbQzvmHAnXsrnJe_i8o1ZNs.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-gnfQM3XgiD5ix0NRf4jtKnl0JFVOnobXnTcnUQ-ZfLk.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-Ng1oP3RST5hxRIn-PpnRb9Txwr1ralLbVA4w8EHVIxo.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-ZDsXkhReuXFcJ7RgCZ9wDp8y4HdNe1ah877yTUwqiYo.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-HBjkiVg_kTDaGaacZpT2yU01IW6Cf6b8VBlhOZpvDuM.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-sDw2681SBtyftX5d39mAChFT7m1C2tCgjGtpUPZJ8wk.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-ECUdTJFpH5peOQWopsrgu5jxZSZmtlG1_wb1yJCANbY.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-CrnDvnDjm_ttpvJr5vJt-6m35cxPQxbN4ef6gr0vfJg.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-klMWnN9IaoyrQXWISyms7NU3yWNW9Rs1R6V_Fz-4Ww4.jar
    Sep 28, 2020 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-V0a3O7jlnm8gnRU3m1dmNiF-669MOPYxfV_8ZLVGf5E.jar
    Sep 28, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-iOLTiVprjsSi1BHnMyIH28S79sqcW1P_pWCaY72miFY.jar
    Sep 28, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-CiOixFjGtXclGJHfC-CsqhSzn0WDah78Tl7mi0FGijg.jar
    Sep 28, 2020 12:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 1 seconds
    Sep 28, 2020 12:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 28, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 28, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 28, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 28, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 28, 2020 12:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash a62f5915a6189a98d8f92165b6f73762744ad91eb6738a7eaf5a27b7cbd4d3ad> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-pi9ZFaYYmpjY-SFltvc3YnRK2R62c4p-r1ont8vU060.pb
    Sep 28, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 28, 2020 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-27_17_45_22-7149177217992534716?project=apache-beam-testing
    Sep 28, 2020 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-27_17_45_22-7149177217992534716
    Sep 28, 2020 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-27_17_45_22-7149177217992534716
    Sep 28, 2020 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-28T00:45:22.453Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 28, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:45:30.178Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 28, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:45:30.907Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 28, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:45:30.970Z: Expanding GroupByKey operations into optimizable parts.
    Sep 28, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:45:31.007Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 28, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:45:31.081Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 28, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:45:31.111Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 28, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:45:31.137Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 28, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:45:31.158Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 28, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:45:31.607Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 28, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:45:31.665Z: Starting 5 workers in us-central1-f...
    Sep 28, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:45:36.609Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 28, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:45:57.593Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 28, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:45:57.620Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 28, 2020 12:46:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:46:02.892Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 28, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:46:15.788Z: Workers have started successfully.
    Sep 28, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:46:15.824Z: Workers have started successfully.
    Sep 28, 2020 12:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:46:49.659Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 28, 2020 12:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:46:49.805Z: Cleaning up.
    Sep 28, 2020 12:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:46:49.880Z: Stopping worker pool...
    Sep 28, 2020 12:47:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:47:32.233Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 28, 2020 12:47:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-28T00:47:32.274Z: Worker pool stopped.
    Sep 28, 2020 12:47:39 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-27_17_45_22-7149177217992534716 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ce22a18e-1052-4b0a-9094-4bac16978518 and timestamp: 2020-09-28T00:47:39.307000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.721

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 28, 2020 12:47:39 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
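
[Editor's note] Only the push-down variant completes in this run (3 tests completed, 2 failed below). For reference, a minimal, hypothetical sketch of expressing the same kind of query with Beam SQL's SqlTransform over an in-memory table; the class name, schema, and data are illustrative and not taken from BigQueryIOPushDownIT.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionTuple;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TupleTag;

    public class BeamSqlQueryExample {
      public static void main(String[] args) {
        Schema schema = Schema.builder()
            .addStringField("by")
            .addStringField("type")
            .addStringField("title")
            .addInt32Field("score")
            .build();

        Pipeline p = Pipeline.create();
        PCollection<Row> hackerNews =
            p.apply(Create.of(
                    Row.withSchema(schema).addValues("alice", "story", "hello", 5).build())
                .withCoder(RowCoder.of(schema)));

        // Register the PCollection under the table name referenced in the SQL statement.
        PCollection<Row> filtered =
            PCollectionTuple.of(new TupleTag<>("HACKER_NEWS"), hackerNews)
                .apply(SqlTransform.query(
                    "SELECT `by` AS author, type, title, score "
                        + "FROM HACKER_NEWS "
                        + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
        p.run().waitUntilFinish();
      }
    }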

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.036 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.057 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 31.226 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 22s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/7222l5qrrllzg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1047

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1047/display/redirect>

Changes:


------------------------------------------
[...truncated 271.07 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 27, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 27, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 27, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 27, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 27, 2020 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 27, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 27, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 27, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 27, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 27, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-yFhG7gtAU3n6z9C7-ZjQ0wayWESg37lQz3ZtuI2O1gE.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-Svhg5zgjmTT78gawSeEj1s-nLNTYxPy2_nq9iWgqTbI.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-1SlBceqzYlSmJl9z-rnCTaYPQaPcs0c37nHCp7OJTQA.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-0XmhMPGPY8gjJlcMscrSudxh-IiW7T1Iz4dDLD8bhUo.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-aUzrAbr4JKSjmNZ1LxRrpTrXdgqmhdPPy4Li50nHfs0.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-aD8FuranYXmBKfEUNbzfHRKMWUEBh4JhwtCM9a1aae4.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-xV12WbqeVEkum18AE1j3WOAmtSgF3BoBVpnQKu29rq0.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-if1eDKeABKpDyWdC5LTmTrWCxCnqG2zqTu1JqJh8a-Y.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-WH4kJQibmfbnLTgj74Cgx15uTc1lJQJ2fWuOn80sMKw.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-lc4znn2cfaX9kwxJcSpbStv6ePNrc5N1T-2eTbe9OAA.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-jTXV2CA1yrPfykNklABnIQojx-cFPZt92e4Uf1rm-6s.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-I2bp1G0J02AjGhMnujEICnM3fkuUS0i_AKnmTdD2knQ.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-kJc61cyaEQPkfs5uCYleEFxncH3pf2fru0ltJ0ccu2Y.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2107590513650399029.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ijOcYzRamsK9q6he9LO_4rOr5ip-1OeSvRM4rrSGUiU.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-RDbLaVz2-5raeLho8ETwpa9LBYifQ_jGrq07vPXpgZk.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-D2iO_pNIUsqkmH18knkHfhOAM6Fgmd6bXHHri0qPqIQ.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-g9jxBtRFoKym9Lhab6pfFyPeZLSRRAO36Cp3kJjKH20.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-4_t9g62ZIkVcH-2gJwQ0I4Al-XlCyQAK2RMGjddBgWc.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-Gqj4LsEC6CqKwnrYRkBR1LADYlbFugBers_9dYqzQG8.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-4KgP9uyLOp6fnWUDVXaSIijwYUcpbUXBTWYxl80SeH0.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-VXdIKt12g0mIWwfcM1SpXN1WPBeWNatu_2QBv62Xrcw.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-1SlBceqzYlSmJl9z-rnCTaYPQaPcs0c37nHCp7OJTQA.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-jtMTWPMerP3pkLGeGIW_kpu_r1M-JByLxXTMsbMYgOI.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-kSLD2c0PnE7r1Jo5ssXEyjXZw2bYpkrXlqk3qGmxzYs.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-H8HDKfg3KRmN1qaQqJNnYIUyigFdarYOIscWTYLM7BE.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-Fhgw4OxKPImwKn6tAkLjYrkowMQYHY8sHE-_uRlSpR0.jar
    Sep 27, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-xz_NLQ10hEBttMY6eL-EVAC1cjm-yyjM45xIbG1KOFI.jar
    Sep 27, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 1 seconds
    Sep 27, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 27, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 27, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 27, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 27, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 27, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95914 bytes, hash b00424351d783dab1ac50c2c78b5145b560929f300593fb03f3f7f9bbbd8a509> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-sAQkNR14PasaxQwseLUUW1YJKfMAWT-wPz9_m7vYpQk.pb
    Sep 27, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 27, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-27_11_45_19-3032873533213885387?project=apache-beam-testing
    Sep 27, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-27_11_45_19-3032873533213885387
    Sep 27, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-27_11_45_19-3032873533213885387
    Sep 27, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-27T18:45:19.639Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 27, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:45:28.159Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 27, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:45:29.299Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 27, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:45:29.340Z: Expanding GroupByKey operations into optimizable parts.
    Sep 27, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:45:29.368Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 27, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:45:29.424Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 27, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:45:29.464Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 27, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:45:29.500Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 27, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:45:29.523Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 27, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:45:29.996Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 27, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:45:30.074Z: Starting 5 workers in us-central1-b...
    Sep 27, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:45:51.453Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 27, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:45:57.584Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 27, 2020 6:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:46:16.148Z: Workers have started successfully.
    Sep 27, 2020 6:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:46:16.185Z: Workers have started successfully.
    Sep 27, 2020 6:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:46:47.391Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 27, 2020 6:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:46:47.537Z: Cleaning up.
    Sep 27, 2020 6:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:46:47.622Z: Stopping worker pool...
    Sep 27, 2020 6:47:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:47:37.298Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 27, 2020 6:47:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T18:47:37.348Z: Worker pool stopped.
    Sep 27, 2020 6:47:44 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-27_11_45_19-3032873533213885387 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b1c49999-b53b-4675-91fd-23375b6e39b4 and timestamp: 2020-09-27T18:47:45.005000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.342

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 27, 2020 6:47:45 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 38.518 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 28s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/k23arrkid3nsu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1046

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1046/display/redirect>

Changes:


------------------------------------------
[...truncated 270.77 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
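
The IllegalStateException above is the schema problem the message itself describes: the PCollection emitted by the RowMonitor step contains Beam Rows but carries no schema, so no RowCoder can be inferred when the pipeline is finalized. A minimal, hypothetical sketch of the remedy the message suggests (field names and types are assumed from the query, and rowOutput stands in for the Row-typed PCollection; neither is taken from the IT code):

    Schema schema =
        Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")        // assumed INT64; match the real column type
            .build();

    // Attaching the schema lets Beam infer a SchemaCoder/RowCoder for the Row elements.
    PCollection<Row> withSchema = rowOutput.setRowSchema(schema);

    // Equivalent alternative named in the message:
    // rowOutput.setCoder(RowCoder.of(schema));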

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 27, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 27, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 27, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 27, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 27, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 27, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 27, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 27, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 27, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
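
The DIRECT_READ plan above differs from the DEFAULT-method plan in that projection and filtering are pushed into the BigQuery storage read (BeamPushDownIOSourceRel) instead of being applied in a downstream BeamCalcRel. A rough standalone sketch of the same query with a table declared for DIRECT_READ, based on the Beam SQL BigQuery table provider; the withDdlString hook, the table location, and the TBLPROPERTIES keys here are assumptions drawn from the public documentation rather than from this test:

    String ddl =
        "CREATE EXTERNAL TABLE HACKER_NEWS (`by` VARCHAR, type VARCHAR, title VARCHAR, score BIGINT) "
            // only the queried columns are declared here; the real table has more
            + "TYPE bigquery "
            + "LOCATION 'bigquery-public-data:hacker_news.full' "              // assumed dataset
            + "TBLPROPERTIES '{\"method\": \"DIRECT_READ\"}'";                 // enables push-down

    PCollection<Row> result =
        pipeline.apply(
            SqlTransform.query(
                    "SELECT `by` AS author, type, title, score "
                        + "FROM HACKER_NEWS "
                        + "WHERE (type = 'story' OR type = 'job') AND score > 2")
                .withDdlString(ddl));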
    Sep 27, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-0Oa3kLLkZl-SNg_eUkmwErkQbd9fm7_vTbVYSF4CHeI.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-RoeBP9T4e7uC2HS_Iopc_zFjA-3Z5IQZR97YPOolW4E.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-WzAp1x4on3eBzxkGq4ewl9j6YbKGIgoROPJvsvqf4Do.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-EOl2uVLiPDB_sN6RNEjfR28LU99WjUgc0jTyCzVjsUg.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-sGCZIE70PyTHvvFpx_8Zc2q2ee2mpEahykRiYUMn6Bo.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-qpetJ6EzRdpzsZxmWlxQBtOt5LScxppRpF6LjUNc1NM.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-6QPEKnxwqnM5182JXL3hLI_pm1eGmp2x7QvHeQEP23c.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-jQum4yiy7uePfoGWyhkUHnQnToc0_ZXr6vs0UeURNtM.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-UJ0TVdVA-A80hKEYpIBRxuLdxHo_jFV0aYQGif6wTUw.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-sGCZIE70PyTHvvFpx_8Zc2q2ee2mpEahykRiYUMn6Bo.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7341994116803359113.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-cl0eD3J6Kb4R9p9aJ1-22OJNfna7s6bXaZdnivinNpE.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-IENsDlGbRsfRc2NTeo0var-iVFzE0e_BCDh9Pn2xnEg.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-XgZtCKw8auZKVmDlD_wSIEmA0YSspBLxRIr9me4cNBo.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-PQhEETNvb2f909PNEe_a53IxYDIaDsa2FJvDiKavxFQ.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-uyr13XO0gRPgK2-fXxyC60cMCtRhNbvGleA6dMEFu6Q.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-LFNSmgRBFrh9rYMMjx4I7ma0KIXUBAfMzX7NpboSOj8.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-qckjG2LATV0dBLNVhPOpuYNjEFwY3QAdy_xzaw2S1sY.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-aFTahuwuOyVZTdg8tm5gjtXV9DMQKPqsXaS9BMeFCCE.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-guAUKMpeUoP7DDvGYAhhTI69v7IHaw6SC8qwEV6ztBQ.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-29vp8A-cNpPthLx_YMK1pmKEbWnjIejYye4WTNJT6EE.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-qUf1JmSvHh_vKxKowYZRN5Nltx0x1b7U5XCbWUItpzg.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-zQEXJcawgHXkpbcgHRlGtowU5vT7-YdQBex-fcOQH8I.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-vvT-tt_qMEVB5rCAPBUvdcE8VS-Dp9YNe2gCWENaRgU.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-sUhDCwbr4VP4WUrRph98emvSLSBr0Z1HwXny-BJr9Wk.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-Jcto0iKeTfb-5mLYPiJko9AH53HPpNrywC16ElF1Q_c.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-1z_Q7yltJilAvS8iPOZ7LX-8X4n9xKVD9z4cUSEHitk.jar
    Sep 27, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-DEp5jWpkb0u9bOxOqgrkEf6eeTamcQ8lKf18PVU9qu4.jar
    Sep 27, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 1 seconds
    Sep 27, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 27, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 27, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 27, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 27, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 27, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash 16aacf788c582adae916a986248c95796f14e15822826df10471c0dd5002d6ed> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-FqrPeIxYKtrpFqmGJIyVeW8U4Vgigm3xBHHA3VAC1u0.pb
    Sep 27, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 27, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-27_05_45_21-14364124214925366901?project=apache-beam-testing
    Sep 27, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-27_05_45_21-14364124214925366901
    Sep 27, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-27_05_45_21-14364124214925366901
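
Besides the gcloud command printed above, the job handle returned by pipeline.run() can be used to cancel from the submitting JVM; a small sketch (pipeline stands for the constructed Pipeline and is not the IT's code):

    PipelineResult result = pipeline.run();
    // ... later, if the job should be torn down before reaching a terminal state:
    if (result.getState() == PipelineResult.State.RUNNING) {
      result.cancel();   // asks the service to cancel the job; declared to throw IOException
    }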
    Sep 27, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-27T12:45:21.901Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 27, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:45:30.216Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 27, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:45:31.291Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 27, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:45:31.328Z: Expanding GroupByKey operations into optimizable parts.
    Sep 27, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:45:31.366Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 27, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:45:31.478Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 27, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:45:31.515Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 27, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:45:31.545Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 27, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:45:31.583Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 27, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:45:31.933Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 27, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:45:32.008Z: Starting 5 workers in us-central1-f...
    Sep 27, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:45:41.161Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 27, 2020 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:45:57.444Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 27, 2020 12:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:46:14.850Z: Workers have started successfully.
    Sep 27, 2020 12:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:46:14.887Z: Workers have started successfully.
    Sep 27, 2020 12:46:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:46:50.647Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 27, 2020 12:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:46:50.777Z: Cleaning up.
    Sep 27, 2020 12:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:46:50.848Z: Stopping worker pool...
    Sep 27, 2020 12:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:47:40.892Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 27, 2020 12:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T12:47:40.931Z: Worker pool stopped.
    Sep 27, 2020 12:47:49 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-27_05_45_21-14364124214925366901 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 5bcc397a-ba91-4c80-9394-a501c3a0301f and timestamp: 2020-09-27T12:47:49.220000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    17.731

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 27, 2020 12:47:49 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.036 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 40.357 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 32s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/iz4gvbisr3gsi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1045

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1045/display/redirect>

Changes:


------------------------------------------
[...truncated 272.47 KB...]
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 27, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 27, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 27, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 27, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 27, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 27, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 27, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 27, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 27, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 27, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT--KNaarnxrpPW3qQgdOXJXcxNDXXI3NyIFDBI62aF9Ro.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-u-wyZ5ecK-yuiJk42w0a4wDo0mi-ywAf3sUxGaGhQcc.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-Ph-0R-WxK9wvR7bGw1Wwy2iTVeDvtCXjHEpohbjOHK8.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-3Q9kwim4H_E01heNHLQZvgL4U0mEe4bLLmdxW6mrxLs.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-TsT6ExhdkMf7QPJ9fHPEElG0wbk-no02zeCf5MDvY98.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-UE4d2EuQTCVng551SgwMwP6-brDWQd-8LrFlE39l3dI.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-c8M_fquw6vLPj93HphKoVEguHl1eqTCYzEZ1JZLraIs.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests--tJK7RyO4iw1_cv5zomzdpZJPRFmLzWtnYKmwVRRFNM.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-QJQUvv5cnTNvQEfhPLtu3RRQ7KUqTU5AxwxkfPmOvVM.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-J6SlfskHH7kJTwRq2S8SHK0zHkf9QGei1GPBUwNrmvk.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-V5Va1LBFToIoM4OuVGIOOk9d7RqyihRbKsgKNWjgzXU.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-wezn-1eanifmxnQozwgyTb9g0C3QMoQCF05XwqftTgI.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-xJ-sTivd6JxWHjA_wIhJWyBZYaYqKTwMoj7aNR786Pk.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-rZRmU8hqyUCjLv-HyorH93ZbQMAnqon1G8C6Kve1FG4.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT--KNaarnxrpPW3qQgdOXJXcxNDXXI3NyIFDBI62aF9Ro.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-GiLX55nsEiJdlKa32qCVYH0pDHlr26z4Ay19vmWN8m0.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-VO2rmgR0gNrUC7U6lEdGrXxVlXkrllrG7JEXD0Xmrhk.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-EkARUIUC2bMj8q1neKC51Pd1Ax1XJpmLl9bt8CtohcA.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-taZbWgQUih13ihedyOnHlJTa0YZCCGkPJqK9poaDKUU.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6755071685026074838.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-TdoqANYs9oSLd-TfFZ1CVDk46U0nV6iTX_sUj5Whrzg.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-SuAUip96Vm0vKcNgZsZEBDYJ2ZbFQuJTt-oPQHwStuY.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-4NuOT76H6uhzcA-EwU6lg9Y4dOI3fIS0tQfd0Ka7uNw.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-RTKMuyKLfjtvdxtmSIl343IhPfmGHSFDSPR0Sp1lJnU.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-6sUxkRzZb0D43oQ0XWIagCHyX2Bwygd75EGSB_Zq794.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-CMgTfe-wz-GmWGLhYjPoJu-UqB5UhcXTG-_xuSQ-ilk.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-6eSyyB5mkN2Ll_mEGTL_OK7nxS28_T8CpcYyyDar1QI.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-GF8owdSpfh6qPcKjdfvk_VqLj5BlZxBHObLsyeUaSLA.jar
    Sep 27, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 0 seconds
    Sep 27, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 27, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 27, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 27, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 27, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 27, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash 7048a731173408deb4551df98b77bdc1c8fb7baffbb3cf49aff76dc841171a8e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-cEinMRc0CN60VR35i3e9wcj7e6_7s89Jr_dtyEEXGo4.pb
    Sep 27, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 27, 2020 6:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-26_23_45_22-8388666750052355909?project=apache-beam-testing
    Sep 27, 2020 6:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-26_23_45_22-8388666750052355909
    Sep 27, 2020 6:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-26_23_45_22-8388666750052355909
    Sep 27, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-27T06:45:22.584Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 27, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:45:30.093Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 27, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:45:30.980Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 27, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:45:31.019Z: Expanding GroupByKey operations into optimizable parts.
    Sep 27, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:45:31.043Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 27, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:45:31.121Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 27, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:45:31.157Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 27, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:45:31.205Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 27, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:45:31.241Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 27, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:45:31.981Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 27, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:45:32.064Z: Starting 5 workers in us-central1-f...
    Sep 27, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:45:55.770Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 27, 2020 6:46:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:46:05.675Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 27, 2020 6:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:46:16.641Z: Workers have started successfully.
    Sep 27, 2020 6:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:46:16.672Z: Workers have started successfully.
    Sep 27, 2020 6:46:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:46:49.137Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 27, 2020 6:46:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:46:49.271Z: Cleaning up.
    Sep 27, 2020 6:46:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:46:49.403Z: Stopping worker pool...
    Sep 27, 2020 6:47:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:47:39.673Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 27, 2020 6:47:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T06:47:39.713Z: Worker pool stopped.
    Sep 27, 2020 6:47:45 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-26_23_45_22-8388666750052355909 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 98eef766-b6ec-4d3b-8b96-c43eb0d999f0 and timestamp: 2020-09-27T06:47:45.268000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.385

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 27, 2020 6:47:45 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 37.909 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 28s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/2amlcia2q2enw

Build cache (/home/jenkins/.gradle/caches/build-cache-1) removing files not accessed on or after Sun Sep 20 06:44:21 UTC 2020.
Build cache (/home/jenkins/.gradle/caches/build-cache-1) cleaned up in 0.199 secs.
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1044

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1044/display/redirect>

Changes:


------------------------------------------
[...truncated 274.08 KB...]
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 27, 2020 12:45:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 27, 2020 12:45:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 12:45:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 27, 2020 12:45:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 27, 2020 12:45:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 12:45:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 27, 2020 12:45:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
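
    The IllegalStateException above (also the cause of the readUsingDirectReadMethod failure) comes
    from a PCollection of Beam Row elements for which no coder can be inferred: a Row only gets a
    coder once a schema is attached to the output PCollection. A minimal, self-contained sketch of
    the remedy the message points to; the schema, field names and DoFn are illustrative, not the
    BigQueryIOPushDownIT source:

        import org.apache.beam.sdk.Pipeline;
        import org.apache.beam.sdk.schemas.Schema;
        import org.apache.beam.sdk.transforms.Create;
        import org.apache.beam.sdk.transforms.DoFn;
        import org.apache.beam.sdk.transforms.ParDo;
        import org.apache.beam.sdk.values.PCollection;
        import org.apache.beam.sdk.values.Row;

        public class RowSchemaExample {
          public static void main(String[] args) {
            Pipeline pipeline = Pipeline.create();

            Schema schema = Schema.builder()
                .addStringField("author")
                .addInt32Field("score")
                .build();

            // A ParDo that emits Row, analogous to ParDo(RowMonitor) in the failing test.
            PCollection<Row> rows = pipeline
                .apply(Create.of("someone:42"))
                .apply(ParDo.of(new DoFn<String, Row>() {
                  @ProcessElement
                  public void processElement(@Element String line, OutputReceiver<Row> out) {
                    String[] parts = line.split(":");
                    out.output(Row.withSchema(schema)
                        .addValues(parts[0], Integer.parseInt(parts[1]))
                        .build());
                  }
                }));

            // Without this, pipeline.run() fails while finishing specification with the
            // "Unable to return a default Coder" error above, because RowCoder needs a schema.
            rows.setRowSchema(schema);
            // Equivalent: rows.setCoder(RowCoder.of(schema));

            pipeline.run().waitUntilFinish();
          }
        }
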

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 27, 2020 12:45:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2020 12:45:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 12:45:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 27, 2020 12:45:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2020 12:45:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2020 12:45:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 27, 2020 12:45:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 27, 2020 12:45:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
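
    The plan above is what successful push-down looks like: BeamPushDownIOSourceRel reads only the
    four used fields and the WHERE clause travels into the BigQuery Storage read ("Pushing down the
    following filter: ..."), whereas with the DEFAULT method the earlier plans keep a 14-column
    BeamCalcRel that filters inside Beam. A sketch of how such a table can be declared so the planner
    is able to do this; the location is a placeholder, and SqlTransform#withDdlString plus the
    "method" table property are assumptions based on the Beam SQL BigQuery table provider:

        import org.apache.beam.sdk.Pipeline;
        import org.apache.beam.sdk.extensions.sql.SqlTransform;
        import org.apache.beam.sdk.values.PCollection;
        import org.apache.beam.sdk.values.Row;

        public class PushDownExample {
          public static void main(String[] args) {
            Pipeline pipeline = Pipeline.create();

            // DIRECT_READ lets the planner push the projection (usedFields) and the
            // supported filter into the BigQuery Storage API read session.
            String ddl =
                "CREATE EXTERNAL TABLE HACKER_NEWS (`by` VARCHAR, `type` VARCHAR, "
                    + "`title` VARCHAR, `score` INTEGER) "
                    + "TYPE bigquery "
                    + "LOCATION 'project-id:dataset.hacker_news' "  // placeholder location
                    + "TBLPROPERTIES '{\"method\": \"DIRECT_READ\"}'";

            PCollection<Row> filtered = pipeline.apply(
                SqlTransform.query(
                        "SELECT `by` AS author, `type`, `title`, `score` "
                            + "FROM HACKER_NEWS "
                            + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2")
                    .withDdlString(ddl));

            pipeline.run().waitUntilFinish();
          }
        }
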
    Sep 27, 2020 12:45:57 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-6sUvo-j2ZEuTKo7ezNDpccgn8LX5D7XkHHOU1IiR03o.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5867853826443468948.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-BicsVDn_04K9vOyjbDTVX9oRhyYKO5XMXHMv0FKk-bc.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-R4ElzhK-SzP7Ov6WKhzjeq4_WrodWJFY979ikB9_AaA.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-5pZEJ18ejIdITf52Zsj9GOwTNymJo5ia-MOQvkj4dXU.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT--b5E7SjUjz4MegFIeUMcg-lipBIva32XOWr_bOEaqsI.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests--10Eg3njy1iYNzLgTgjvz3Lq68Ba9Nc4APd1KSFjZ4Y.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-ZH5soDgmXoys1eMruh4gBSr89hWrnB6VvHEXpAgYxEI.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-jy_PANLdHZMTsdcAlcGqQwLb0czGeSgX9HcOS1M3nXI.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-9yjfxihPsKh8ACNonugS-VhPUPCYgdxE9TSOre-DGrU.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tJvF-LMtO9Wk00SWEnvCVuVbKdHI73Ietd57J2ep7hg.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-3D0MXUgZgSogB7beiNPlRLk2uKY7odPmAKjlWNlrZZ4.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-uWZUZnu1SYy2XIxDip5d0GDdMQjbPyp6DS3k9ctqZZ0.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-bMHB7APRRzZRnvvP6ugF4T_BDDWn1oxeZ2lp0UNXm7k.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-rRKAY51PkHcPgF-fezRXjhKHFwYMBULm1G3DyCJWYVI.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/6.6.1/workerMain/gradle-worker.jar to gs://temp-storage-for-perf-tests/loadtests/staging/gradle-worker-U_aL5a9pcLxwBxCREhg6dHkIraZJd5OxEJ7qw-KLY7I.jar
    Sep 27, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-6sb9b4hsn52VN2ku1LjeVwVoOv2HS4r2A52yXhu3mOM.jar
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-2eriKnXDWmH021xzL_o3MTeo7SzF9sDVYgW9eoMLAB0.jar
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-6sUvo-j2ZEuTKo7ezNDpccgn8LX5D7XkHHOU1IiR03o.jar
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-8nrbO1TWH_-4PBPrRpAU7W7MJkAyKBrwj2PobzWp7rY.jar
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-03fFfPG37QNmODmUK-989fcutqrqvVcpHeeWIN_Bbgk.jar
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-NRTiM-AQicTX1GFzjw2AZnUPotGPdXISdsj-2zZFAIU.jar
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-lYI5wZga1PbYnI5V8PKH1AP5bzHSvFc5TDU4OBK5G5E.jar
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-xF8sgxNG7vx8yDOQGibI9l1Uepoau0ZFJz04uBQ4OpI.jar
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-YVQr98Sp419mwkaHB966DOF8C4SHWD519_GEuH76Gc0.jar
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-RTYTJREgxRDCOZDLMpe2X4diQJlfHyuySrb4Oi80P8U.jar
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-oB0ypzqttOtavk_Y2zFqDzxiUuebtIko6AUxrbWYZow.jar
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-SLGixNvFZ5_6qogVwLmt9JuyI_f2SePGwVUfzguJkwI.jar
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-u1aY9Q8GnCu6UiVDe_SsKlnSZ5__GeGhaEZP0RU-i0A.jar
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-ndWIL-tv0PGXGQ4nRkLnmMFKjIV0BsV2ABdlOWhq5oU.jar
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 192 files cached, 28 files newly uploaded in 0 seconds
    Sep 27, 2020 12:46:01 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 27, 2020 12:46:02 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 27, 2020 12:46:02 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 27, 2020 12:46:02 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 27, 2020 12:46:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 27, 2020 12:46:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash 30b20c6f02991b5fb08178174be1fefa833d547df0934ee6e2fc25471e52a4eb> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-MLIMbwKZG1-wgXgXS-H--oM9VH3wk07m4vwlRx5SpOs.pb
    Sep 27, 2020 12:46:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 27, 2020 12:46:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-26_17_46_02-554644357181342224?project=apache-beam-testing
    Sep 27, 2020 12:46:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-26_17_46_02-554644357181342224
    Sep 27, 2020 12:46:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-26_17_46_02-554644357181342224
    Sep 27, 2020 12:46:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-27T00:46:02.481Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 27, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:46:10.652Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 27, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:46:12.210Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 27, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:46:12.247Z: Expanding GroupByKey operations into optimizable parts.
    Sep 27, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:46:12.269Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 27, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:46:12.327Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 27, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:46:12.364Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 27, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:46:12.401Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 27, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:46:12.428Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 27, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:46:12.762Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 27, 2020 12:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:46:12.847Z: Starting 5 workers in us-central1-f...
    Sep 27, 2020 12:46:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:46:35.038Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 27, 2020 12:46:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:46:35.613Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 27, 2020 12:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:46:56.155Z: Workers have started successfully.
    Sep 27, 2020 12:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:46:56.179Z: Workers have started successfully.
    Sep 27, 2020 12:47:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:47:32.185Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 27, 2020 12:47:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:47:32.312Z: Cleaning up.
    Sep 27, 2020 12:47:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:47:32.408Z: Stopping worker pool...
    Sep 27, 2020 12:48:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:48:21.393Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 27, 2020 12:48:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-27T00:48:21.434Z: Worker pool stopped.
    Sep 27, 2020 12:48:27 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-26_17_46_02-554644357181342224 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 5edf6410-6ef9-4340-b0d6-285ee1d0a118 and timestamp: 2020-09-27T00:48:27.717000000Z:
                     Metric:                    Value:
                   read_time                    16.875
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 27, 2020 12:48:28 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 40.607 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 10s
107 actionable tasks: 66 executed, 41 from cache

Publishing build scan...
https://gradle.com/s/jbcnkenilccne

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1043

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1043/display/redirect>

Changes:


------------------------------------------
[...truncated 271.37 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 26, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 26, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 26, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 26, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 26, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 26, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 26, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 26, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 26, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 26, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-gEMvbcJ0QFPHhI3nM31ZexS5ZXPlAsVGMiWAasdl5XA.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-0QQYYdSQ4Muj0utkBGIhfBraRWXN1KZko_Ja4O-R-_s.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-__TY8YKKu-dcZQZQJPmUJ_hOisz9uqlcHGYEMk0_S2A.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-MuIDUK5E4fkigHSRSOfdnSn7dw6VcNfB1fItsoaU_VI.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-jfDAqb_UUo-hbJ6XcYVU3Wgi0wFxC5akO_KRU0XhICY.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-7om7bRBojdJfbwtp9Wcdy_olkVjxBEp7BnbYVbaUHgw.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-BV9zLqyQr2QyQwM5MlsqZ54XU8ZAJkBUVDRozD6tt84.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-Tae6FerTreP05Bl18T8fqeJWZfnoi513UpO0Kug8gt8.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-YRRFd9VokoiwHqqqDwLHDHqVZBRRC7nWy6XsSjppAsM.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-uvH8mM2QcM7-LgU2Tq5pvdgX3qG9fEJTXgTfDIS8YDA.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT--1-d3Hdw9hafLOYK6B9sFtbqPyRS17PegCB-v8upNrI.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-x0Qp1MUOT8DjHjZiWOdTJ4CP0JEa-GvbskuSx2EBekg.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-69Z30Qj8UhluJJsGQdjm0v5ok8bFZbMqoEAcwFNg_pA.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-M0xQvH47rmQepqEDNpiraShymVetldqmaEyrXgM08dg.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-Zl2rejjC4Cp-zgOsebq2J7ZT8F70sZGHgCe5xYuluh8.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-blRMSQ37fItVK5TtivnzdXXXcP1s5n-0Tsd4cU8Sa-Y.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-OIS-asTowK8MGA1puoHEsjK6anS9g_yhQxwvLS0bYJo.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-UJYMj-dy7rZY9ISNqHiCQqh8hAFzHypgH2Uqub9g6iM.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-1YTSaGg2lWpQ8xjOQdgXOx8IVOV2qSwheZdweeeT64Y.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-05AVyeprOxTQxodQxN3zf_LkCgPfnOu3Goel6MQk_X8.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-4ULsgmt-sM9XfrFrq_65PR3rUBp9fUT98gHdHbDG_rg.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test977875011780437055.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-tz48bLChALScWT8OI2tP528332m2QcIKPrvKOrPVop4.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-gEMvbcJ0QFPHhI3nM31ZexS5ZXPlAsVGMiWAasdl5XA.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-dEb_Ulzke4-knAJ2sPOPINWT7mhBy_OEkZ2mYLiYD7k.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-LPLPfB7PM__4QKQsyEIsRIH-kCG3k5GcKq2PV7RwpFE.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-fZfGL7Hpr551Ux7Srp-eg0rHjrNJS7K-BRTe73S_VRA.jar
    Sep 26, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-4hSoktn6-SR5FjwzikrMEkupHLAY0vqdvaYVnzRKRtg.jar
    Sep 26, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 1 seconds
    Sep 26, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 26, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 26, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 26, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 26, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 26, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95915 bytes, hash 40edb86630ac4047c0bfca851713dde1521441aeae01a8a17073b77321ed7f92> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-QO24ZjCsQEfAv8qFFxPd4VIUQa6uAaihcHO3cyHtf5I.pb
    Sep 26, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 26, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-26_11_45_27-1515094053020343100?project=apache-beam-testing
    Sep 26, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-26_11_45_27-1515094053020343100
    Sep 26, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-26_11_45_27-1515094053020343100
    Sep 26, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-26T18:45:27.043Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 26, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:45:35.320Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 26, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:45:35.867Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 26, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:45:35.907Z: Expanding GroupByKey operations into optimizable parts.
    Sep 26, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:45:35.995Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 26, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:45:36.069Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 26, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:45:36.100Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 26, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:45:36.127Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 26, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:45:36.155Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 26, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:45:36.483Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 26, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:45:36.559Z: Starting 5 workers in us-central1-b...
    Sep 26, 2020 6:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:45:46.132Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 26, 2020 6:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:46:04.940Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 26, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:46:22.462Z: Workers have started successfully.
    Sep 26, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:46:22.541Z: Workers have started successfully.
    Sep 26, 2020 6:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:46:51.416Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 26, 2020 6:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:46:51.571Z: Cleaning up.
    Sep 26, 2020 6:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:46:51.644Z: Stopping worker pool...
    Sep 26, 2020 6:47:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:47:43.311Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 26, 2020 6:47:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T18:47:43.350Z: Worker pool stopped.
    Sep 26, 2020 6:47:48 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-26_11_45_27-1515094053020343100 finished with status DONE.
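
[Editor's note] The "metric descriptors" warning in the run above reports that the project already holds 100 Dataflow-created descriptors, so new custom.googleapis.com/* metrics will not be created, and it points at the Monitoring API's metricDescriptors.list/delete methods for cleanup. Purely as a hedged sketch (not part of this test, of Beam, or of the Jenkins job), the snippet below uses the Cloud Monitoring v3 Java client to list and delete user-defined descriptors; the project ID, the filter, and the assumption that every matched descriptor is safe to delete are all illustrative placeholders.

    import com.google.api.MetricDescriptor;
    import com.google.cloud.monitoring.v3.MetricServiceClient;
    import com.google.monitoring.v3.ListMetricDescriptorsRequest;
    import com.google.monitoring.v3.ProjectName;

    public class CleanUpCustomMetricDescriptors {
      public static void main(String[] args) throws Exception {
        String projectId = "my-project"; // placeholder; substitute the affected project
        try (MetricServiceClient client = MetricServiceClient.create()) {
          ListMetricDescriptorsRequest request =
              ListMetricDescriptorsRequest.newBuilder()
                  .setName(ProjectName.of(projectId).toString())
                  // Only consider user-defined (custom.googleapis.com/*) descriptors.
                  .setFilter("metric.type = starts_with(\"custom.googleapis.com/\")")
                  .build();
          for (MetricDescriptor descriptor : client.listMetricDescriptors(request).iterateAll()) {
            // Review which descriptors are genuinely unused before running this:
            // the loop deletes every descriptor matched by the filter above.
            client.deleteMetricDescriptor(descriptor.getName());
          }
        }
      }
    }
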

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 177a3dc8-496a-481e-9236-9cac779c420d and timestamp: 2020-09-26T18:47:48.924000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.161

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 26, 2020 6:47:49 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 37.853 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 31s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/pcswymruuv6mu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1042

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1042/display/redirect>

Changes:


------------------------------------------
[...truncated 270.44 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
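
[Editor's note] The failure above is Beam's generic coder-inference error: ParDo(RowMonitor) emits a PCollection of Row, and a coder for Row cannot be inferred without a schema. The message itself names the remedies (.setCoder() or PCollection.setRowSchema). The stand-alone pipeline below is only a minimal illustration of that remediation, not the fix for BigQueryIOPushDownIT; the schema fields merely mirror the columns selected by the test query (author, type, title, score) and are assumptions, not the real HACKER_NEWS schema.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Illustrative schema mirroring the projected columns of the test query.
        Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();

        PCollection<Row> rows =
            p.apply(
                Create.of(Row.withSchema(schema).addValues("alice", "story", "hello", 3L).build())
                    .withRowSchema(schema));

        // A ParDo that emits Row has no inferable coder on its own. Attaching the schema
        // to its output (the remedy named in the error message) lets Beam pick a Row coder.
        rows.apply(
                ParDo.of(
                    new DoFn<Row, Row>() {
                      @ProcessElement
                      public void processElement(@Element Row row, OutputReceiver<Row> out) {
                        out.output(row);
                      }
                    }))
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }

Calling .setCoder(RowCoder.of(schema)) on the same output would work equally well; which of the two the test at BigQueryIOPushDownIT.java:162 should use is a question for that test code, not for this sketch.
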

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 26, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 26, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-cu2eBEyDYWM7FmtP780480JQEAUsLOYohS6SIYGdWok.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-7p_dd5XQi3MMfi1wBkLRhBMXiyS9PIi5QRznE_m6cGQ.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-ippPn-cs36hH84gDTT14SHILx5ATvPjH-i2CVCppWiE.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-CHAJ2FN4eV9VNh25T1JfiaAz92DP5Z5bweNGzaW-Z_c.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-cWjs6trXdh6roW2ihciNraDke1QSa9nr5PijqHopFKc.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-CjM-vmYL7OoBCnCbho9deRYuQD22iHzXb2PYehNdM-g.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-SJHQB-G9pS0RjkefpQk3EYofi0urYtPZJ305R_ilDHw.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-YgiWAJEIDl0XlPzFoNsLrPkIHfylcUP7nVZ0PMpAAIM.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-qxaGzfM9_FHH_I0YJOexKmvUN2aR8HBYBOhCt2Yapfg.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-612QYmYaQbayH9cJkjSsURyFf4cuLcI90TNqiR9Hl8c.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-LaUHsu-7QtJ629JZAgYodA5iFFtlOZBOVzU8K_Xd6TQ.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-vDPCH7wBzYpvZhPAlQhRFiMIh57YzXnPk4Yq7FiF2Cw.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-3GJ5-I3TucpdoQyfGzTyCgxM08Hzcoqnsqwx42Q6psg.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-ngeq9is9kRcr8fHZrj9_FryOHdYwYcgtqkPYSX1GiO4.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-ysWRD53WfHzVtMgtbF7aaa92FM7Y3ZVdEYtfDQf2KDk.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-_ob41VWHifZYkCYZ4kQkZxM1e-jLbbuj9JbVLwXnaxg.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-cu2eBEyDYWM7FmtP780480JQEAUsLOYohS6SIYGdWok.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7564404547239429244.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-aImjjSo_ZIzkO9FjUV1mhJAbUOy7eTwHT8jYwbT6qs0.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-SrW4AIZ8_BVKaqqtvm8vBC57KRb_pAQCmuVWaOhYq_c.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-lz4mwZW93F_5rqB9i2Ntt0bvH7YYfbh2jyXMdL4TEH8.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-I26BrKYwv_4JfM9hXX-JrM4pQXGtGB9WPBhehSvhCmE.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-ojEchKmeJFU0_v2amaX6O5IwyzGiq353rR3Sc4fp7mo.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-VXKY62yGxvurmaAJ42a9Aavlu8VfKuwfLkhuW1056EY.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-PgmNDa2UmDk7miACExvhL9RLH_2rs1332v6fJ3QFwTc.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-VtyD1Cy1r5hV4_VdYKEPoByf82SZhcmt1o3QdkfHnw0.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-EnrXuhTo_ly_LVGodA9xV-UTrWicyazKzHYWv2BAE5o.jar
    Sep 26, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-up7ZYbanGIQtooxEr6TS-B8omfo3kljF1sOQpVu0Emk.jar
    Sep 26, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 0 seconds
    Sep 26, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 26, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 26, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 26, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 26, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 26, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95914 bytes, hash 7be597d9a0ccebb3e283be269d90b3d986027af5e0c0fd5bd8e4d3b3833839fd> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-e-WX2aDM67Pig74mnZCz2YYCevXgwP1b2OTTs4M4Of0.pb
    Sep 26, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 26, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-26_05_45_23-17277875205666871586?project=apache-beam-testing
    Sep 26, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-26_05_45_23-17277875205666871586
    Sep 26, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-26_05_45_23-17277875205666871586
    Sep 26, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-26T12:45:23.147Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 26, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:45:33.799Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 26, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:45:34.442Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 26, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:45:34.474Z: Expanding GroupByKey operations into optimizable parts.
    Sep 26, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:45:34.502Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 26, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:45:34.580Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 26, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:45:34.607Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 26, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:45:34.631Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 26, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:45:34.663Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 26, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:45:35.010Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 26, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:45:35.082Z: Starting 5 workers in us-central1-f...
    Sep 26, 2020 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:46:00.881Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 26, 2020 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:46:08.841Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 26, 2020 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:46:20.173Z: Workers have started successfully.
    Sep 26, 2020 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:46:20.205Z: Workers have started successfully.
    Sep 26, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:46:52.961Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 26, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:46:53.122Z: Cleaning up.
    Sep 26, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:46:53.195Z: Stopping worker pool...
    Sep 26, 2020 12:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:47:40.371Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 26, 2020 12:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T12:47:40.418Z: Worker pool stopped.
    Sep 26, 2020 12:47:46 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-26_05_45_23-17277875205666871586 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ad6b38e9-196f-49b4-81b8-a21ab002f810 and timestamp: 2020-09-26T12:47:46.290000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.239

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 26, 2020 12:47:46 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 37.282 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 30s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/gtmuwe2pz4o2o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1041

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1041/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10959] Go SDK: Store a fixed amount of known process bundle


------------------------------------------
[...truncated 273.48 KB...]
    Sep 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 26, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 26, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 26, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-zQlyVgiPS315qD7r7PXUvLxFIqUHgN5lmDufmsJNTwc.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-mnAPMOzD_Z-jrPJEoeERgXL-Q0v2yd3MNLyCXODcAIk.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8785051967837179816.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-KRnvjdIkmFeXXm7i21CYxd6buXK7YZ0Mr8Fq0TNkBB0.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-OaclGtPsdzjswp2y8uw8FRy9TEPzQ_y_nwUzEhVqbD4.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-g6Qg9nFf8PPjpdAMTQcfufl5fz82yVeAOVaYyMbfOv4.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-kzeSJgYDVN8rDbgNcDLkVXCbLmEKUPcX6JchOU4DVmc.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-CSj21lD_PnxLMh0BoyLruEamq81h7iA0MHeKB2vDcNM.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-0-sPSHP0c6qXx5dkRUQ1T-06raTMvVJSy-HOUGEEMFg.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-pxuBY66bOB5_pKRYFqL22cY8qlldZAgSbe_EirYxys4.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tXKhU52N2QRX2Pb8Zx5hf7_qAbpYb5FK0CBwm-dg__M.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-zQlyVgiPS315qD7r7PXUvLxFIqUHgN5lmDufmsJNTwc.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-2ddHc19DIgUlb-wubMMeD1J-GxGHSUMicCLHUBkKSpo.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-niysu5pyZ95eP-9jao6MARxEjem5OiU91ZYD5zmOjrU.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-FM6w6qdONj-rkTfXvS8u2VTipPLUad44avOljjhET4s.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-dfM3nNIJegL_1qrZXDc16qqmie6uugewdPrd0Bcbtqs.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests--xeFaOZ-2sbD4k8qwD2yVtMQOzjGmysMhyrDqY3g7q4.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-bErCNTaZuzTlFKFQleeerFe_Kz48FArmyAIIXn9KvKw.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-JfvjoS24S8PCLdO-89vRVsmh71dGP8R58HIeTYwULHk.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-7UzWALl7niGEltxp0CpjroVyNLMv9WkW8N8DjjgSdeQ.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-2QeoLal9f2G_i06WcsPTJEmjMDYgNDQUjXzJTbI3NSs.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-Y2CC-ERBAHyDQoRum1gnInxjdCu6piy1n9hhMxvOi4I.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-ofM1hZ-Ag2KHrWopeZTkT7-Ii7FCkhnvkCDq0UxyXAk.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-x3P8zsArvgJuL5qtN5J9E7870MLjoTpSUxz0OkgXZu8.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-VpHL6Fcs-noLeET4JrkbJMx60fpJ8YqwV0WIEb8eVA4.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-tvGmWVwnJ1URMkR2duwgXB4fxVtLZbLOmlPGE0xUQtc.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-6-jj58To92qnIyVJ7KNRs5bKlIlxdIxsudAMJVfU2kw.jar
    Sep 26, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-5_J2Ul-7w7vsMJE8fGFFyhO_yRyT63f8n1z-KEcf_uQ.jar
    Sep 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.testcontainers/kafka/1.14.3/b90885e30e86eb454e7b0e8e580cf59616e9de39/kafka-1.14.3.jar to gs://temp-storage-for-perf-tests/loadtests/staging/kafka-1.14.3-ITKoa6D-lUtLem0itUO_R5B5PqtFpvvOjJf9bTByVUk.jar
    Sep 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.testcontainers/testcontainers/1.14.3/71fc82ba663f469447a19434e7db90f3a872753/testcontainers-1.14.3.jar to gs://temp-storage-for-perf-tests/loadtests/staging/testcontainers-1.14.3-pumfUOVGxIRJ3HrNRtmRWTZcp5hY59_klYGQF6d6puQ.jar
    Sep 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth/tcp-unix-socket-proxy/1.0.2/cf53989130986c60113032e25185f4496ffbc186/tcp-unix-socket-proxy-1.0.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/tcp-unix-socket-proxy-1.0.2-KCMDGSy00Z-cM3qmB0jDR5sQSJV17Q8Xofxkoxij7SM.jar
    Sep 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth.duct-tape/duct-tape/1.0.8/92edc22a9ab2f3e17c9bf700aaee377d50e8b530/duct-tape-1.0.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/duct-tape-1.0.8-Mc7xLd7JedH4bXz3CMQaF9pSPQXGhf1mQunQsq3bckA.jar
    Sep 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth.visible-assertions/visible-assertions/2.1.2/20d31a578030ec8e941888537267d3123c2ad1c1/visible-assertions-2.1.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/visible-assertions-2.1.2-RQSulosjfNzcto_1sHqmOr5JkvkHp3w9YSCqm5BBQBw.jar
    Sep 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.kohlschutter.junixsocket/junixsocket-common/2.0.4/b4d1870bf903412533e0b79c6fcd402defcfc05b/junixsocket-common-2.0.4.jar to gs://temp-storage-for-perf-tests/loadtests/staging/junixsocket-common-2.0.4-r8N2Fez3-t_3TSmvtEP-T2M9OWZG2J2CXoIkoneDn2A.jar
    Sep 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.jetbrains/annotations/19.0.0/efbff6752f67a7c9de3e4251c086a88e23591dfd/annotations-19.0.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/annotations-19.0.0-Ev8B7q8MCcamjy7AJLO_n6TK1uaLdLlov2LH91kEcDI.jar
    Sep 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.kohlschutter.junixsocket/junixsocket-native-common/2.0.4/726bd66a934dea39c817382986496fa4eda96411/junixsocket-native-common-2.0.4.jar to gs://temp-storage-for-perf-tests/loadtests/staging/junixsocket-native-common-2.0.4-92O4XsFT2VMJB0dOfyBspSsocDfXBLrO3aON1cTQ9gw.jar
    Sep 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.scijava/native-lib-loader/2.0.2/1451fa03954c5e31a358b411147de472b4dab92c/native-lib-loader-2.0.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/native-lib-loader-2.0.2-5WfHHp8_9T94vVj9a6bUcc4x4SY_XofR4fzF0-2h4kg.jar
    Sep 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna/5.5.0/e0845217c4907822403912ad6828d8e0b256208/jna-5.5.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-5.5.0-swj66_5O1AnehBDgpjLRZLISawNfbqz_lo05CMr7TZ4.jar
    Sep 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna-platform/5.5.0/af38e7c4d0fc73c23ecd785443705bfdee5b90bf/jna-platform-5.5.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-platform-5.5.0-JNgWIfgqwp_N2adBFgMfWQeiNDFY5hb0Vzu_okNK4NU.jar
    Sep 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 183 files cached, 37 files newly uploaded in 1 seconds
    Sep 26, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 26, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 26, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 26, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 26, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 26, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash 871a62452919f18072d69f7132e2dd9510374869de9a39a2a6b41cb7213349be> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-hxpiRSkZ8YBy1p9xMuLdlRA3SGnemjmiprQctyEzSb4.pb
    Sep 26, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 26, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-25_23_45_21-778638351766346219?project=apache-beam-testing
    Sep 26, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-25_23_45_21-778638351766346219
    Sep 26, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-25_23_45_21-778638351766346219
    Sep 26, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-26T06:45:21.387Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 26, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:45:29.450Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 26, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:45:30.626Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 26, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:45:30.659Z: Expanding GroupByKey operations into optimizable parts.
    Sep 26, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:45:30.691Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 26, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:45:30.768Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 26, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:45:30.797Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 26, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:45:30.829Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 26, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:45:30.852Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 26, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:45:31.373Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 26, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:45:31.451Z: Starting 5 workers in us-central1-f...
    Sep 26, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:45:53.495Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
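
For reference, the warning above can be acted on outside the test run: old user-defined metric descriptors can be listed and deleted through the Cloud Monitoring API. A rough sketch follows, assuming the google-cloud-monitoring v3 Java client; the project id and the prefix check are illustrative only, and anything deleted this way cannot be recovered.

    import com.google.api.MetricDescriptor;
    import com.google.cloud.monitoring.v3.MetricServiceClient;

    public class MetricDescriptorCleanupSketch {
      public static void main(String[] args) throws Exception {
        // Assumes application-default credentials with Monitoring admin permissions.
        try (MetricServiceClient client = MetricServiceClient.create()) {
          for (MetricDescriptor descriptor :
              client.listMetricDescriptors("projects/apache-beam-testing").iterateAll()) {
            // Only user-defined descriptors (custom.googleapis.com/*) count against the limit
            // mentioned in the warning above.
            if (descriptor.getType().startsWith("custom.googleapis.com/")) {
              System.out.println("Deleting " + descriptor.getName());
              client.deleteMetricDescriptor(descriptor.getName());
            }
          }
        }
      }
    }
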
    Sep 26, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:45:54.771Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 26, 2020 6:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:46:13.978Z: Workers have started successfully.
    Sep 26, 2020 6:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:46:14.015Z: Workers have started successfully.
    Sep 26, 2020 6:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:46:45.447Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 26, 2020 6:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:46:45.614Z: Cleaning up.
    Sep 26, 2020 6:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:46:45.722Z: Stopping worker pool...
    Sep 26, 2020 6:47:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:47:26.924Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 26, 2020 6:47:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T06:47:26.969Z: Worker pool stopped.
    Sep 26, 2020 6:47:32 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-25_23_45_21-778638351766346219 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c6628144-2ddb-4f69-935e-3b9ce8f5f86f and timestamp: 2020-09-26T06:47:32.808000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.502
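
For reference, fields_read and read_time above are ordinary Beam metrics collected from the finished Dataflow job. A rough sketch of how such a counter can be defined in a DoFn and read back from the PipelineResult follows; the namespace, metric name and pass-through DoFn are illustrative assumptions, not the IT's actual code.

    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricResult;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.metrics.MetricsFilter;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.Row;

    public class RowCountMetricsSketch {

      /** Passes Rows through while counting their fields, loosely analogous to fields_read. */
      public static class CountFieldsFn extends DoFn<Row, Row> {
        private final Counter fieldsRead = Metrics.counter("perf_test", "fields_read");

        @ProcessElement
        public void processElement(ProcessContext c) {
          fieldsRead.inc(c.element().getFieldCount());
          c.output(c.element());
        }
      }

      /** Reads the counter back from a finished PipelineResult, as a load test would. */
      public static void printFieldsRead(PipelineResult result) {
        MetricQueryResults metrics =
            result.metrics()
                .queryMetrics(
                    MetricsFilter.builder()
                        .addNameFilter(MetricNameFilter.named("perf_test", "fields_read"))
                        .build());
        for (MetricResult<Long> counter : metrics.getCounters()) {
          System.out.println(counter.getName() + ": " + counter.getAttempted());
        }
      }
    }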

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 26, 2020 6:47:33 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 25.887 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 16s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/qvrbwhpbonohg

Build cache (/home/jenkins/.gradle/caches/build-cache-1) removing files not accessed on or after Sat Sep 19 06:44:21 UTC 2020.
Build cache (/home/jenkins/.gradle/caches/build-cache-1) cleaned up in 0.704 secs.
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1040

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1040/display/redirect?page=changes>

Changes:

[noreply] spannerio.py doc typo fix

[noreply] [BEAM-10975] Remove capture_output argument in sdk_container_builder

[noreply] [BEAM-10959] Store a fixed amount of known process bundle instructions

[noreply] [BEAM-10977] Disable codecov annotations in GH


------------------------------------------
[...truncated 270.14 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
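
For reference, the failure above is exactly the missing-schema problem the exception spells out: the Row output of ParDo(RowMonitor) has no schema attached, so no RowCoder can be inferred when the pipeline is finalized. A minimal hand-written sketch of the remedies the message suggests follows; the schema, field names and trivial pass-through DoFn are illustrative assumptions, not the IT's actual code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        // Illustrative schema standing in for the projected HACKER_NEWS columns.
        Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();

        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        Row row = Row.withSchema(schema).addValues("alice", "story", "Hello", 42L).build();

        PCollection<Row> input =
            pipeline.apply(Create.of(row).withCoder(RowCoder.of(schema)));

        PCollection<Row> monitored =
            input
                .apply("PassThrough", ParDo.of(new DoFn<Row, Row>() {
                  @ProcessElement
                  public void processElement(ProcessContext c) {
                    c.output(c.element());
                  }
                }))
                // The call the error message asks for: without it the Row output has no
                // schema, coder inference fails, and pipeline construction aborts as above.
                .setRowSchema(schema);
        // Alternative remedy from the same message: monitored.setCoder(RowCoder.of(schema));

        pipeline.run().waitUntilFinish();
      }
    }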

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 26, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 26, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 26, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 26, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 26, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 26, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 26, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 26, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 26, 2020 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
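
For reference, the push-down logged above corresponds roughly to a plain BigQueryIO read in which only the used fields are requested and the supported predicate is sent to the BigQuery Storage API as a row restriction. A hand-written sketch follows; the table reference is a placeholder and this is not the IT's actual code.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Roughly what the SQL push-down amounts to at the BigQueryIO level:
        // project only the used fields and hand the supported filter to the
        // Storage API as a row restriction.
        PCollection<TableRow> rows =
            pipeline.apply(
                BigQueryIO.readTableRows()
                    .from("my-project:my_dataset.hacker_news") // placeholder table reference
                    .withMethod(TypedRead.Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        pipeline.run().waitUntilFinish();
      }
    }
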
    Sep 26, 2020 12:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-gD0cssr_Nre30SwYn3vE7C66yFDX-nbOpX8G6Rr1o4I.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8795106637567226117.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-182YcuIBAqsbc9XXQrgt9z8lziGP2aPWrzjQ9TGhrJo.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-vfb-NjDjkw9Z6EBRfY4CCnHX5ESFNpPy8bhH2RkZZOE.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-vaHqsb4jBdoMouVhGkkLO8VJuyjcHsEn02Hfe07YB_Y.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-XzG5ZP_HT06ZU2yLejT1nSLVCWLLdEjcr7jJYcz1kO0.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-aoa7nlV7wQJZx6Ntr5ny6fj6W1KHkTzWmFX_BY3ExrY.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-RMLrVvBmihHPPQBEJN0rOxs0BwESBlBhJ95owhPbASU.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-JML0ThKyKX8agBUT1ujDf8Z3swYoWjs9TiShalNGoUE.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT--huXLTjAanmoM_YVOGQpeqQjcwvDxaHNh8c2kCj_JKU.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-L9gC7xj4Sj_N6Ss-PJ9dQhQYPH4SPsFwo8dC7m2p-j4.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-Xk0S3nbeJe3IwRVMGmEYgY7j1IghiH-svmO___BXIgU.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-GON0UK1rHGubzwPJH76vNjflb8_6RROzdZ-cGRC7SZ0.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-z3dXZQHhwk7MO2fuCwNYPdXZyUo_9-LI7fNsTrtyOpU.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-RGsa7WI14w_XSif2erNBOLpAKTaKWwiG3xoSr9RVstw.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-hUmucsJGN_dGSvAdaMjYAbAtq5NAJYNaMmBCcigyXs8.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-bUYNv5dg1Ys9B4j7bf-LIqPwJm-tzeBpl6i9VDapT1g.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT--th_JXrcpBw_2JLD1-XWmFAd7ah9QUFCv9Hw_uEIxMc.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-2VL-IUR3jxFRWzEdkIbvGQ7OS88c4k3htnRAUwdrZ6g.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-NE7aYXHgOEakpiFy7KCxoLN6WPJxL0ohfYZ7vEmu_Jk.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-cG_dqu0kfy3QjTSwF1zawkPdUZ8AdG68R-xoH2AJSlk.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-fhSZPawiM2A1UetpFnaqvgNS1UTpHP5Az2jZB3yq290.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-Ab3RYGmXmHnzzq46fziiHYU0F7NuADI8JfsLoGfvqKg.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-gD0cssr_Nre30SwYn3vE7C66yFDX-nbOpX8G6Rr1o4I.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-NxPkraHyJAdVcI-Y40vKnop38HbS_Ru93V-2ES1aDw0.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-OCmzxJ_4Y58RNJ5HoHwu4Mp2BB271fxKXeGgMUX3TvE.jar
    Sep 26, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-SoxvWIKHeeUe1wbsLzYoZ7fCVSDj5QW3r5VWlle143Q.jar
    Sep 26, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-2H-S1pf0IDD8wA5EQ8ZpoEl4IP8NadB9ZE_LAlbKMI0.jar
    Sep 26, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 1 seconds
    Sep 26, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95914 bytes, hash ffeae5bedc95e0a4611cabe589a0197c4e8e28768f3bbcad3ee8e632205a1cee> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-_-rlvtyV4KRhHKvliaAZfE6OKHaPO7ytPujmMiBaHO4.pb
    Sep 26, 2020 12:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 26, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-25_17_45_18-3924713643940463119?project=apache-beam-testing
    Sep 26, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-25_17_45_18-3924713643940463119
    Sep 26, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-25_17_45_18-3924713643940463119
    Sep 26, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-26T00:45:18.419Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 26, 2020 12:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:45:26.079Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 26, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:45:26.733Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 26, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:45:26.777Z: Expanding GroupByKey operations into optimizable parts.
    Sep 26, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:45:26.818Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 26, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:45:26.878Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 26, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:45:26.918Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 26, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:45:26.947Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 26, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:45:26.995Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 26, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:45:27.368Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 26, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:45:27.460Z: Starting 5 workers in us-central1-b...
    Sep 26, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:45:58.794Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 26, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:45:58.909Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 26, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:46:16.306Z: Workers have started successfully.
    Sep 26, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:46:16.343Z: Workers have started successfully.
    Sep 26, 2020 12:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:46:54.013Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 26, 2020 12:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:46:54.215Z: Cleaning up.
    Sep 26, 2020 12:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:46:54.351Z: Stopping worker pool...
    Sep 26, 2020 12:47:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:47:49.554Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 26, 2020 12:47:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-26T00:47:49.610Z: Worker pool stopped.
    Sep 26, 2020 12:47:56 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-25_17_45_18-3924713643940463119 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 7556ddf3-b0a6-4d3e-8959-dcbd9fbab40c and timestamp: 2020-09-26T00:47:56.128000000Z:
                     Metric:                    Value:
                   read_time                    19.008
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 26, 2020 12:47:56 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 2 mins 51.503 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 40s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/eqlgksxmcbzde

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1039

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1039/display/redirect?page=changes>

Changes:

[Robin Qiu] Update release guide and script

[noreply] [BEAM-9616] Add RegisterDoFn (#12903)


------------------------------------------
[...truncated 269.54 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 25, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 25, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 25, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 25, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 25, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 25, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 25, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 25, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 25, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 25, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test68209336544983503.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-R7pM3kHKQ4IhBo1WX8noSLgXgDwI6F6oOReZ5G2O30g.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-2G7S5KEMDYiE_UXE3XYOo_pQ8iBdPYW2gMXD0vwq028.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-VonJKzQBO6CeHcKqgFZGZLYzUewrL3MYMGLM5Phg3N4.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-zNw86IOpi7BeF2xLs3VTviZOQhymHWVZ4uLhAwBTCxg.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-kw3QtIKJBY-B0crw7JtI5muTxlMiDal-9Va-X2tA9vY.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-npZE5ffDe-UZpgYkUL1ypv-L8ZQC8ife2AlubGIJJd4.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-TEjAmnDl4JUMAFyZ-2ZsK_Plj-K3XPxnBTFWqa0Akls.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-59sgTLv1FpAR5LU8mPQLfvWQXXRLNxzsLQcvjAVJkKU.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-8lOaUZ8qEifHhQhzocuMrEtNqBtbdR7o4CIi5Gi6kKI.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-poH27Z0Btqcy-7gmNT4bs2yucL90xoaWIVX99ZaVhB8.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-VrN8-u0wTtVFw5oLC9w-xzMO0rJ1BaPBzU0Gs8HfgLM.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tX0mukQ_WVA2k1P2-IUZe06mpTLTXHz8E9TOmjPTtfs.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-59sgTLv1FpAR5LU8mPQLfvWQXXRLNxzsLQcvjAVJkKU.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-mPV6HPfEkJue7HIPOnl2RScjh-HPael2ABM0WDHdAI8.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-MyCtXh1Zoh2dkZRYkJnJ4cZVCbpsHnkOofOr0pgVNm0.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-qhar9h3ENYJxn6sJPE9YYzfCFyshkOiN2QOzO5GWNFM.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-2vph_3W-ZB1wdcoiom2FW_vxqZE6dtPqfjGbXODvEmQ.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-2kymy4nG1sUrigstw9u2Q4vytwVHJ2rkMr2JOH_FN8I.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-dGstqqSNR6QbFHUgPdxRGEl-iban9BqxUQKXJNGZ88U.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-oFwZ_eElRFjLSrJ_IUWNop_HU41a2LDYmqple6UAofU.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-t6E3pqN36TCkdRJ3PWMmtOYda1A_PaLQfPJEt8cBfvE.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-FaCkTV9QL8g4QSG7fgLnb2t80oDrcxkvLIYwWfbPI8c.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-GIddA4cFJ1fcTjIyLM7Ue6rUnoraan8KOvjH-c36Lmo.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-9WIgyOqxXL2ETM4SvKIS3JkZeNWCiJiFDWN_uhtVrWM.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-59iIkDUdnkT8Ygg1YLAFawtqd4ypWBa4mmrI9Gx51vU.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-XC76sFBLgNoWsiTaj9L2ZQZ-5GbOv0M7mLq6zUeGSCk.jar
    Sep 25, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-v-ilty99BFTM7uC6sATiXj1YQFFC3Bk_strnS0ZYBKc.jar
    Sep 25, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 0 seconds
    Sep 25, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 25, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 25, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 25, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 25, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 25, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95914 bytes, hash 4ebdcbe622e77920de0dd3a91d4224ac4d38c400266cef855295de0b1a232227> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Tr3L5iLneSDeDdOpHUIkrE04xAAmbO-FUpXeCxojIic.pb
    Sep 25, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 25, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-25_11_45_20-2334548006027065858?project=apache-beam-testing
    Sep 25, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-25_11_45_20-2334548006027065858
    Sep 25, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-25_11_45_20-2334548006027065858
    Sep 25, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-25T18:45:20.864Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 25, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:45:29.439Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 25, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:45:30.775Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 25, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:45:30.954Z: Expanding GroupByKey operations into optimizable parts.
    Sep 25, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:45:31.030Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 25, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:45:31.138Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 25, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:45:31.191Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 25, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:45:31.246Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 25, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:45:31.300Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 25, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:45:31.936Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 25, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:45:32.047Z: Starting 5 workers in us-central1-b...
    Sep 25, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:45:46.242Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 25, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:45:56.835Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 25, 2020 6:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:46:14.619Z: Workers have started successfully.
    Sep 25, 2020 6:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:46:14.663Z: Workers have started successfully.
    Sep 25, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:46:57.349Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 25, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:46:57.535Z: Cleaning up.
    Sep 25, 2020 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:46:57.645Z: Stopping worker pool...
    Sep 25, 2020 6:47:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:47:50.002Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 25, 2020 6:47:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T18:47:50.061Z: Worker pool stopped.
    Sep 25, 2020 6:47:58 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-25_11_45_20-2334548006027065858 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c9c6edab-67c8-4a3f-9e8f-8125dd2bc494 and timestamp: 2020-09-25T18:47:58.911000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    23.286

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 25, 2020 6:47:59 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 51.584 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 42s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/zzdpyf4jbf2ic

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1038

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1038/display/redirect?page=changes>

Changes:

[noreply] Change when streaming Combine on Flink should be fired (#12931)

[Ismaël Mejía] [BEAM-10759] Uses reader Avro schema to deserialize in KafkaIO


------------------------------------------
[...truncated 273.46 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
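
    The failure above is the same coder problem in both non-push-down read tests: the PCollection of Beam Rows produced by ParDo(RowMonitor) carries no schema, so no Row coder can be inferred when the pipeline is finalized. Below is a minimal sketch of the kind of fix the error message suggests, assuming a hypothetical four-field schema matching the columns the test query selects; the class, element values, and transform names are illustrative and not taken from the test itself.

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Hypothetical schema mirroring the columns selected by the test query.
        Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();

        Row row =
            Row.withSchema(schema).addValues("someone", "story", "a title", 3L).build();

        PCollection<Row> rows =
            p.apply(Create.of(Arrays.asList(row)).withCoder(RowCoder.of(schema)))
                .apply("PassThrough", ParDo.of(
                    new DoFn<Row, Row>() {
                      @ProcessElement
                      public void process(@Element Row r, OutputReceiver<Row> out) {
                        out.output(r);
                      }
                    }))
                // Without an explicit schema (or a setCoder(RowCoder.of(schema)) call),
                // the Row coder cannot be inferred and the pipeline fails with the
                // IllegalStateException shown above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }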

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 25, 2020 12:45:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
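
    The plan above shows what the push-down buys: only the four used fields are requested and the filter is handed to the BigQuery Storage read. Outside of Beam SQL, roughly the same direct read could be written with BigQueryIO as in the sketch below; the table reference is a placeholder, not the table the test actually reads.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Roughly what the pushed-down SQL plan asks the Storage API for:
        // only four columns, with rows restricted by the same predicate.
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("some-project:some_dataset.hacker_news")  // placeholder table
                    .withMethod(TypedRead.Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }

    With DIRECT_READ, the selected fields and the row restriction are applied by the BigQuery Storage Read API on the server side, so the pipeline only receives the projected, filtered rows.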
    Sep 25, 2020 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-8JnFHP8oFwlt_LJFot2yxAETnsTPktPZ_Xmt1967uA8.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-Wa31o9CyBamZ24K5YU8C0xO4w-o4YEiOfXndOSzRoW4.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2292524598700075263.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-YS-j4eGwzi_tIRebvZj3a7-8ij1jQ6xZ2IVOS15IX6g.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-UjjvkMsF5n8UiNJ57I8-QvnvNJbzJvOzzFealA5OyxU.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-T5nbIHVcYSKxvvo_rQbfJlXWbDq_Iq7SAWfL1vbeYu4.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-9RmGtv4glmz14hs9DvqLLAfvaOztlfxU76B6xUACvM4.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-aCnTRNm1uTWA-V7k8YL9jYXNH9GKt1g3bk7in0ZWadY.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-fxnB8qT5mDxAl1dqA-d3MRQTaPAvS-WGpR0mY8ZkmUM.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-c6JZn8Wrrm_Y9spUEJigaF2DEDPYMwntdgqz3iZTDR0.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-31akpKo2TW0Cm_6GqPI69rGEVlRxEaPwdM1_IZCy6FM.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-rK6SV9n0CQf-OcDUjMRMsVWPJdXBYjQNK8HT84V_6SI.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-8JnFHP8oFwlt_LJFot2yxAETnsTPktPZ_Xmt1967uA8.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-U81NL-SV6Ydr15FxU3rYp-bWXhkKEb9cGarfEdhmmT4.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-Nii5dPPKVpgZj2lvYsy2vN4XOx5zaysqg3XpNTxc4ko.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-0Yn0_zHTkYapPxW7TZeCvOr7HSp-pPAb1K33mstiUtE.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-PGWIVluDZvHpsrjIjV2ksopNinrZEVm5bj0mFBvjUkc.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-Lei8q-w2a3JY32nCCgVGMhfTC55HIA81KwZjAr9Olm0.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-mxDyhX6RF3C3nlRx50cyoZ6LfNeNnRT8XM4Pm1Nak-I.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-qTCAvghx5qRyVOmYld2dXrpmFOjio2PT3z2TJsq3K5s.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-EtxFlMYmzl5Wmvub_a33FslWdG2wuWu-e9rV9neu77w.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-QkQDKq8eCAMpinU9EOd6UwIVXeSIz2geke4N4x05Q0Y.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-ijY7AO9o4emFhq3VkHLAUe1zT6xwWJBiv77k24zL-J0.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-zBPtQahBhWypgeaDV0OsEdsUtvTziBkPUvWlGOenNEA.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-rz6j9t60DX8uISgdSp3yw6rxv81OxjUnRYY0sI2LYvY.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-tenTa5wxh-hrRqqNREYR9PmPnHMwiKo4OAjpSltD1kw.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-qQ4lUA8SOV04bd7kdUnxOADmo76JzXGzC7tX8aNQvWw.jar
    Sep 25, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-ExR9PvUkKcwO1LDaHbknol3TyemYE6C-H0EerUUuFnQ.jar
    Sep 25, 2020 12:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 0 seconds
    Sep 25, 2020 12:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 25, 2020 12:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 25, 2020 12:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 25, 2020 12:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 25, 2020 12:45:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 25, 2020 12:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash b606bad74f67b3c3f1b00d966f103a73e71da95198a8dca2bb4e44ee3ace2780> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-tga6109ns8PxsA2WbxA6c-cdqVGYqNyiu05E7jrOJ4A.pb
    Sep 25, 2020 12:45:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 25, 2020 12:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-25_05_45_40-4874489163137861910?project=apache-beam-testing
    Sep 25, 2020 12:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-25_05_45_40-4874489163137861910
    Sep 25, 2020 12:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-25_05_45_40-4874489163137861910
    Sep 25, 2020 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-25T12:45:40.797Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 25, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:45:47.955Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 25, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:45:48.777Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 25, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:45:48.827Z: Expanding GroupByKey operations into optimizable parts.
    Sep 25, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:45:48.861Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 25, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:45:48.940Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 25, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:45:49.033Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 25, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:45:49.070Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 25, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:45:49.102Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 25, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:45:49.447Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 25, 2020 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:45:49.525Z: Starting 5 workers in us-central1-b...
    Sep 25, 2020 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:46:00.559Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 25, 2020 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:46:18.351Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 25, 2020 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:46:35.581Z: Workers have started successfully.
    Sep 25, 2020 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:46:35.615Z: Workers have started successfully.
    Sep 25, 2020 12:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:47:08.495Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 25, 2020 12:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:47:08.656Z: Cleaning up.
    Sep 25, 2020 12:47:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:47:08.743Z: Stopping worker pool...
    Sep 25, 2020 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:48:02.610Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 25, 2020 12:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T12:48:02.656Z: Worker pool stopped.
    Sep 25, 2020 12:48:17 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-25_05_45_40-4874489163137861910 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 40b209dc-7110-4488-aae7-b1bc92ce7f25 and timestamp: 2020-09-25T12:48:17.525000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.857

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 25, 2020 12:48:18 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 50.659 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 1s
107 actionable tasks: 65 executed, 42 from cache

Publishing build scan...
https://gradle.com/s/jkb2dh66hhojk

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1037

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1037/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10960] Fix DropFields does not maintain the original fields order

[noreply] [BEAM-6103] Enable BQ streaming insert timeouts (#12893)


------------------------------------------
[...truncated 273.51 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 25, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 25, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 25, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 25, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 25, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 25, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 25, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 25, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 25, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 25, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-4awc4viSGQj6vDkDmtGxg717QPCKYEco98KVX7rNThk.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6501801575398705674.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-K2GVQ15aO51MIvn9fD0nS7eXkBbOFB0yr2w3PNjykHg.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-QC1zp6oVCD3ibRCR5UMZlcSNtGHakD5pxm34GK73FMI.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-t8kXh3TPlXP-gQJnUI84uAMwCKMJ8u_cPl7RH4BJp3Q.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-8u-kTOWsnAS-EvK1K8JRkA3xPPMSqSpH5-t5Bv7MwYE.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-hXuEhVV9pdbwPr4YACa-6VcVtubmuQ2rZfGiyyGTXaE.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-dTFthZ50bUj0a0fQNLgmYE6j_le3kKkNgkG0LBSmh4s.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-JFNZeNPh2d089zyXejW4YPj7HO9tz9zQnOJcNPubGhA.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-zN72JffqJUH9rG2QkM_COspXLDCIAZTrCKX0f5KAb9Y.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-7ZqTaUtKiAy_c5G1k09TlA6KW4bnDats129oQDJdks4.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-Mdle9oMO0Bne4JosjCS5Da4lGWFcC7n2nXINs12e6iQ.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-ppaCylyo6zuMiAOio60nUZ9KyOrlLGpWJ12aBthrQnA.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-4E2hCnmTkCXy7n9ZkCnzLUaw-qZcox3Wv-OZZjEl9wU.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-WwhtnYbU-90HLM4ifHs9cFiHEXhmXEEHc4pa9_oVCSg.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-OlD5oEq_xxBfJec8hpooLrt2ANJRXPSFKhmyBPr0mW4.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-50EVHwjcQXlfMdvUu2cQUV5lIQTKl-fEuZTt5u1KoD0.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-J8QZItIVvcOnoLyWyzYc5y7HwDyuNzLbJf6vTQSlSuM.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-t8kXh3TPlXP-gQJnUI84uAMwCKMJ8u_cPl7RH4BJp3Q.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-Qs3QkSu-V0qaCrSi7s6n2KIZMGDeTmyqeN63i0alTAg.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-DB_3psvC9vzp0mRrOASHKHucpXKmx0rNFQSb0mwZGyY.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-NMY6NoDERz6eQrsbbLvGAx50SU86s113xbv9Cz6oCTA.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-inZbsR1D2W_xYdTOqnWFRr0sGclcAu3tvNxFnNFPKjg.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-PfPo0XhEVEgSjs7y0UkarHiM9LcfPnt43PV1fsBYikE.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-C0jpYq4O_JKPNp-TQAuyiJ9kpZYKBzgYejVtEx1xhG4.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-hidlS6As_Lu-bunUjVKBcBKxOvenGbjBltbzJoESb2E.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-aOfJpiyGMtb4cpSFza-I9bZQW3vNDqVFSWV_GSgx1aw.jar
    Sep 25, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-mH865L8aFLd3XnoUfqLu8saFjLj1sh5z6xi2iPgqaDE.jar
    Sep 25, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 0 seconds
    Sep 25, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 25, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 25, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 25, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 25, 2020 6:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 25, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash 84ae51a8899a7f93578ad47bd3b511094b69dfd646391522662bdbb03cfe0bc1> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-hK5RqImaf5NXitR707URCUtp39ZGORUiZivbsDz-C8E.pb
    Sep 25, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 25, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-24_23_45_38-11740137979377362187?project=apache-beam-testing
    Sep 25, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-24_23_45_38-11740137979377362187
    Sep 25, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-24_23_45_38-11740137979377362187
    Sep 25, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-25T06:45:38.228Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 25, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:45:47.151Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 25, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:45:47.769Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 25, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:45:47.856Z: Expanding GroupByKey operations into optimizable parts.
    Sep 25, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:45:47.883Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 25, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:45:47.987Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 25, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:45:48.023Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 25, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:45:48.061Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 25, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:45:48.142Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 25, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:45:48.713Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 25, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:45:48.786Z: Starting 5 workers in us-central1-f...
    Sep 25, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:45:56.159Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 25, 2020 6:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:46:19.346Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 25, 2020 6:46:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:46:39.131Z: Workers have started successfully.
    Sep 25, 2020 6:46:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:46:39.159Z: Workers have started successfully.
    Sep 25, 2020 6:47:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:47:12.696Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 25, 2020 6:47:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:47:12.976Z: Cleaning up.
    Sep 25, 2020 6:47:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:47:13.118Z: Stopping worker pool...
    Sep 25, 2020 6:48:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:48:04.996Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 25, 2020 6:48:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T06:48:05.064Z: Worker pool stopped.
    Sep 25, 2020 6:48:13 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-24_23_45_38-11740137979377362187 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 8a6e7baa-8aa4-4aa6-b371-db07eb2d86c2 and timestamp: 2020-09-25T06:48:13.285000000Z:
                     Metric:                    Value:
                   read_time                    14.507
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 25, 2020 6:48:13 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 48.698 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 56s
107 actionable tasks: 64 executed, 43 from cache

Publishing build scan...
https://gradle.com/s/akns33g7bzc5u

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1036

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1036/display/redirect?page=changes>

Changes:

[milan.cermak] Adds unit tests for the parse_table_reference function

[noreply] Merge pull request #12918 from [BEAM-10910]: Validate the BigQuery table

[noreply] [Minor] Typo/grammatical changes (#12849)


------------------------------------------
[...truncated 294.50 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 25, 2020 12:53:07 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 25, 2020 12:53:07 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 12:53:07 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 25, 2020 12:53:07 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 25, 2020 12:53:07 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 12:53:07 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 25, 2020 12:53:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
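
The IllegalStateException above is the standard missing-coder failure for a PCollection of Beam Rows: the RowMonitor ParDo emits Row elements, a coder cannot be inferred for Row without a schema, and so the pipeline has to attach one before the next transform is applied. Below is a minimal, hypothetical sketch of the remedy the message itself points at (PCollection.setRowSchema); the two-column schema and the DoFn are illustrative stand-ins, not the integration test's actual code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Hypothetical two-column schema; the IT derives its schema from the BigQuery table.
        final Schema schema =
            Schema.builder().addStringField("type").addInt32Field("score").build();

        // A ParDo that emits Row elements, analogous to ParDo(RowMonitor) above.
        PCollection<Row> rows =
            p.apply(Create.of("story,3", "job,5"))
                .apply(
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void process(@Element String line, OutputReceiver<Row> out) {
                            String[] parts = line.split(",");
                            out.output(
                                Row.withSchema(schema)
                                    .addValues(parts[0], Integer.parseInt(parts[1]))
                                    .build());
                          }
                        }));

        // Without this, applying the next transform fails with the coder error shown above;
        // setRowSchema attaches a SchemaCoder (rows.setCoder(RowCoder.of(schema)) also works).
        rows.setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }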

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 25, 2020 12:53:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2020 12:53:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 12:53:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 25, 2020 12:53:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2020 12:53:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2020 12:53:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 25, 2020 12:53:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 25, 2020 12:53:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
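    As a point of reference for the push-down logged above: the projection and filter can only be pushed into the read when the BigQuery table is registered with the DIRECT_READ method. A minimal sketch of how such a table might be declared through Beam SQL follows; the table name, column list, project/dataset location, and the use of SqlTransform.withDdlString are illustrative assumptions rather than the integration test's actual setup.

        import org.apache.beam.sdk.Pipeline;
        import org.apache.beam.sdk.extensions.sql.SqlTransform;
        import org.apache.beam.sdk.options.PipelineOptionsFactory;
        import org.apache.beam.sdk.values.PCollection;
        import org.apache.beam.sdk.values.Row;

        public class DirectReadPushDownSketch {
          public static void main(String[] args) {
            Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

            // Hypothetical DDL: the "method": "DIRECT_READ" property selects the BigQuery
            // Storage API read path, which is what lets the planner push the projection
            // and filter down to the source. Location and columns are placeholders.
            String ddl =
                "CREATE EXTERNAL TABLE HACKER_NEWS (`by` VARCHAR, `type` VARCHAR, title VARCHAR, score BIGINT) "
                    + "TYPE bigquery "
                    + "LOCATION 'my-project:my_dataset.hacker_news' "
                    + "TBLPROPERTIES '{\"method\": \"DIRECT_READ\"}'";

            PCollection<Row> result =
                p.apply(
                    SqlTransform.query(
                            "SELECT `by` AS author, `type`, title, score FROM HACKER_NEWS "
                                + "WHERE (`type` = 'story' OR `type` = 'job') AND score > 2")
                        .withDdlString(ddl));

            // Running this for real requires GCP credentials and an existing table at LOCATION.
            p.run().waitUntilFinish();
          }
        }

    With the DEFAULT (export-based) read method, as in the readUsingDefaultMethod plan earlier, the same query still runs but the WHERE clause is evaluated inside the BeamCalcRel instead of being pushed to the BigQuery Storage read.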
    Sep 25, 2020 12:53:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 25, 2020 12:53:22 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 25, 2020 12:53:23 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT--A7ooheg7dDDB_Whh1cloPDao7O1tkWEox1IysFraDY.jar
    Sep 25, 2020 12:53:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-2tbebiE0_vFtkYBUqDJelgK72nccy1qJFZuOOOSS5H8.jar
    Sep 25, 2020 12:53:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-bbnIN6iG9BRQ9svTYvgb-J7FmOLuRwqImsIhboZ2FMk.jar
    Sep 25, 2020 12:53:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-BjsLxu9Kr4mgUZleFd_JeOBFc7wCXG7BxYmQc10RQTg.jar
    Sep 25, 2020 12:53:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-Zoemnav-llwPY2xo3Tbwm0i_ew-5TLvQpVUeQbqh6Pw.jar
    Sep 25, 2020 12:53:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-DQraQZmLN1YnuB8AM6tOt5J10ovIJ6SwdkzQK2h_GOg.jar
    Sep 25, 2020 12:53:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-JxrDDDUNK2AggHnjylAz65_O2fNp6Snthok2Tr5lv2w.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-iSoB1TbxWoCcjAXffnVAIJp-uw-UUuTLpvDNOuu1q2I.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-NvxPRYfqAcIVyrtyHy8vaOb6ntdFmDc6E-lR1DDf31o.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-LLdacHRx18nvU_deCA4mwHwld34beqVrPK5X840Azsk.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-wl9hfF5NSU1AiI74JdTAo_HkSONS_VStbWhb5VH463Q.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-BZdSUGfhbYYdzLd9OSLadHalgguONUC9wYESsf67H80.jar
    Sep 25, 2020 12:53:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-rnAD-8eCmlhHXkgcCvhGal4TSZqgQhT9OXbwWjpXibM.jar
    Sep 25, 2020 12:53:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-LSn490j5MzLXYuujMwJPDhebhdxwCUL2Hym_8ZFxXyk.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-0fMY0M9_robLU45AhpZ33ZzcJL7YSwg1QV2XJsXWXi0.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-Rt_p9MFwI89xWYU-1IsctLPwakxFHRbStpOuUt8yss0.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-LFaz_hhU6qL4DOB6QIJu1IudWYovAo4etTSohnCtzyU.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-tests-uQusBpAIbMo_i-OXUVE5j-ZlnoGVeTaq29jjXi3E64Y.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-g_tC0BTMfGWoI0Dv0ruhD_nSOf03IeXtlgb-cGjTnNc.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-RKmHDiZZiNtxBIPsbniTeT7u_lG436il8ByVrI5OUq0.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-UTN-nnLuGdY7E0ZIFeehHX8JY04D7ZIeJj2D7yb8aiM.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-iihFdtufQND_I9xaXadQD7eHnv2K8dEZWnDZl_dQgO4.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-co4nzRpL6AOCtx5dLxf-BJb77yLEAQ42-CV3bhOApb4.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT--A7ooheg7dDDB_Whh1cloPDao7O1tkWEox1IysFraDY.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-QsQNDfjLzyiGHezPuNOwnwyMNATmwvwB8yhWaVQFGgc.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-TSJ8uAJaVshFW5saUUPUp8FQp_GmqxkjgY8LkgS84eQ.jar
    Sep 25, 2020 12:53:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3920304788997613909.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-9QaFcvF2c9I6mtrcPfbUmAzgJp4WLW42qN3pp7-TkiY.jar
    Sep 25, 2020 12:53:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-GGkvTZeNyELMKKww-FjaPBY3uuJjnI2S7oMWxgrD_q8.jar
    Sep 25, 2020 12:53:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.26.0-SNAPSHOT-BeGNvit7d9EdTfYHYktkn7rE8duZpVSJ7Vhd1DhCSvc.jar
    Sep 25, 2020 12:53:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.26.0-SNAPSHOT-hzcpkkk-kkiMcfh8GMc7X6I5_uq6lLDcwN_jn335P5M.jar
    Sep 25, 2020 12:53:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.26.0-SNAPSHOT-GlFi8hO7A0FKshT9-QJcn9gsghobCqebsvfjZmwDk2c.jar
    Sep 25, 2020 12:53:30 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 190 files cached, 30 files newly uploaded in 7 seconds
    Sep 25, 2020 12:53:30 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 25, 2020 12:53:31 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 25, 2020 12:53:31 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 25, 2020 12:53:31 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 25, 2020 12:53:31 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 25, 2020 12:53:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash 0b38c9101dcd5a72d262cbb1455ae70c1df381b7dcbf749721a65b0dbb470364> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-CzjJEB3NWnLSYsuxRVrnDB3zgbfcv3SXIaZbDbtHA2Q.pb
    Sep 25, 2020 12:53:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 25, 2020 12:53:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-24_17_53_32-3328448307700009274?project=apache-beam-testing
    Sep 25, 2020 12:53:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-24_17_53_32-3328448307700009274
    Sep 25, 2020 12:53:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-24_17_53_32-3328448307700009274
    Sep 25, 2020 12:53:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-25T00:53:32.404Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 25, 2020 12:53:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:53:42.526Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 25, 2020 12:53:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:53:43.455Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 25, 2020 12:53:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:53:43.496Z: Expanding GroupByKey operations into optimizable parts.
    Sep 25, 2020 12:53:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:53:43.523Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 25, 2020 12:53:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:53:43.599Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 25, 2020 12:53:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:53:43.622Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 25, 2020 12:53:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:53:43.660Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 25, 2020 12:53:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:53:43.683Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 25, 2020 12:53:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:53:44.157Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 25, 2020 12:53:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:53:44.234Z: Starting 5 workers in us-central1-b...
    Sep 25, 2020 12:54:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:53:58.909Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 25, 2020 12:54:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:54:12.144Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 25, 2020 12:54:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:54:31.586Z: Workers have started successfully.
    Sep 25, 2020 12:54:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:54:31.621Z: Workers have started successfully.
    Sep 25, 2020 12:55:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:55:05.656Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 25, 2020 12:55:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:55:05.795Z: Cleaning up.
    Sep 25, 2020 12:55:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:55:05.861Z: Stopping worker pool...
    Sep 25, 2020 12:55:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:55:56.232Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 25, 2020 12:55:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-25T00:55:56.270Z: Worker pool stopped.
    Sep 25, 2020 12:56:06 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-24_17_53_32-3328448307700009274 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c02f32e4-fcf0-46e7-91f5-561adda2e69e and timestamp: 2020-09-25T00:56:07.086000000Z:
                     Metric:                    Value:
                   read_time                      13.5
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 25, 2020 12:56:08 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.098 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.138 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 3 mins 43.007 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 11m 4s
107 actionable tasks: 66 executed, 41 from cache

Publishing build scan...
https://gradle.com/s/5yhkdexspccuk

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1035

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1035/display/redirect?page=changes>

Changes:

[Chad Dombrova] [BEAM-7746] Add type checking to coders

[noreply] [BEAM-9154] Ensure Chicago Taxi Example is disabled on Jenkins (#12929)

[noreply] Add to_pcollection example to wordcount_dataframe (#12923)


------------------------------------------
[...truncated 269.14 KB...]
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 24, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 24, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 24, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 24, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 24, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 24, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 24, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 24, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 24, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 24, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-chMhFqMxMPZLTav2LsNvG-NNb0VCNC0_vBEfoTslyuI.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-chMhFqMxMPZLTav2LsNvG-NNb0VCNC0_vBEfoTslyuI.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-HJwRA5XbFuwgX3efzqwaGcQYGQkW6p15Np05y1_v2e4.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-DoUYwo-d6U_ip5Z97rpPyJmnzwxy_f4kq6jH-yYgcwQ.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-a3ZOG3KcWkqs9l1Sv2OG3SPngZKdF-I6XDu5c0kqNCg.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-OAhP3Xo-iOZNW-1a4XLLjr299ZNDoMCpftDzHvGddEo.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-YQB1lGe1tJtGn51EgfEl_UBtJNBLRTnnVUINjS8JISo.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-yRXcaSXaDZbrb3Km5HDX1XMp0U33qM1Sq3sOyP1oONU.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-7C0pJZER1nqmbWPsXNuYu3rDE_HAL-iZE6t3LTNwzaM.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-FOkdBrllTJ2xgvMWHISzwNUWaQcde4KaVp68x5pfKmw.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-gNDe8ZPQvMedpC_Hcb_9exyITXI9-GC0NC-_arnBEdM.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-7tFo4ss52ukXcWWSVOWwnyvIHpD8ozGQHPOth2-HL7A.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-Rz7vGBO2giWuGTivPAqN26ASnWMe9rND6RCG0dNG7p4.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7983317506118141826.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-zJ_N3sgPK3g7yjjtRNECYYuZt4AWh3a7068EkxnhdzM.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-kCwqakENimiFDXpyERoX33FALw0iLf2izcn0H15FkIs.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-KL-0ofa70-MML5HfNDmu0x_rGf-bk51VNYQa1NG5MSM.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-blhTgj2vI4Zk-P0gksh4QYD0GapXM4-AHiN8NJT9WVg.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-irc7V0eTkKsM4pC2rtzNgcfBIXrEk3qYwEGw1JvrEPk.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-VSlnGmHEVJ3gTxaOS7sYz6RblBeFX3LPk8Mff5Ltq4E.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-mcKIokvJU8D7TEPKXG-wO-YahRhJt0DFI6EavsPqhK8.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-VWW-CkM8oME53BdaDz1OTdDh_96VyeosebgvRaKkbn4.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-bIuqYkZUtcOpF_peWV3ZvA7XmOML11YnmKuA2yZs1zo.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-KZft39E0rrbZVFbgkc7uEFLmzC6KuDQ9FanxNLsV69Y.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-uQAsQOTwC8Y4wvfp6lNu466O-gw9NljzkhzMeHdqaKY.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-j8cb1bI_X4dEfFh09m1lZ4fkK4IIj7VUjydz-VmoEYE.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-1Imlu7vkCJ0F3KMvN8_5Sd3Q7ZdnPozO_567hR1Pih0.jar
    Sep 24, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-dh2gGTPz0pkzSyUS61yT4sNzZXSUZhr4IKS1BhvhDlE.jar
    Sep 24, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 0 seconds
    Sep 24, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 24, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 24, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 24, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 24, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 24, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash bf9ce98605db7f30fe3241f009482a460fa203a54c5fd777152e6d435169ef93> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-v5zphgXbfzD-MkHwCUgqRg-iA6VMX9d3FS5tQ1Fp75M.pb
    Sep 24, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 24, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-24_11_45_20-10026403084809293690?project=apache-beam-testing
    Sep 24, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-24_11_45_20-10026403084809293690
    Sep 24, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-24_11_45_20-10026403084809293690
    Sep 24, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-24T18:45:20.692Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 24, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:45:28.252Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 24, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:45:28.890Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 24, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:45:28.936Z: Expanding GroupByKey operations into optimizable parts.
    Sep 24, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:45:28.971Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 24, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:45:29.112Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 24, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:45:29.136Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 24, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:45:29.163Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 24, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:45:29.200Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 24, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:45:29.569Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 24, 2020 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:45:29.629Z: Starting 5 workers in us-central1-b...
    Sep 24, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:45:45.256Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 24, 2020 6:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:45:52.974Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Sep 24, 2020 6:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:45:52.999Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Sep 24, 2020 6:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:45:58.304Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 24, 2020 6:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:46:12.962Z: Workers have started successfully.
    Sep 24, 2020 6:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:46:12.995Z: Workers have started successfully.
    Sep 24, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:46:46.000Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 24, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:46:46.147Z: Cleaning up.
    Sep 24, 2020 6:46:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:46:46.221Z: Stopping worker pool...
    Sep 24, 2020 6:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:47:41.702Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 24, 2020 6:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T18:47:41.745Z: Worker pool stopped.
    Sep 24, 2020 6:47:52 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-24_11_45_20-10026403084809293690 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): e3e92647-7e7c-41b3-aa1c-6adf914542e3 and timestamp: 2020-09-24T18:47:53.051000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.366

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 24, 2020 6:47:53 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
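
Note: the InfluxDB warning above does not affect the test outcome; the publisher simply skips publication when the measurement and database settings are absent. A minimal sketch of that kind of guard, in plain Java with illustrative property names (the actual property names read by InfluxDBPublisher are not shown in this log and are assumed here), is:

    import java.util.logging.Logger;

    public class MetricsPublisherGuard {
      private static final Logger LOG = Logger.getLogger(MetricsPublisherGuard.class.getName());

      // Illustrative only: the real configuration keys used by the Beam test harness may differ.
      public static void publishIfConfigured(Runnable publish) {
        String measurement = System.getProperty("influxMeasurement");
        String database = System.getProperty("influxDatabase");
        if (measurement == null || database == null) {
          // Mirrors the WARNING seen in the log: metrics are silently skipped.
          LOG.warning("Missing property -- measurement/database. Metrics won't be published.");
          return;
        }
        publish.run();
      }
    }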

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.045 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 46.229 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 36s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/gnrkd6742b7k4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1034

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1034/display/redirect>

Changes:


------------------------------------------
[...truncated 269.85 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
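
Note: this is the same coder failure in every red build in this thread: the PCollection of Beam Rows emitted by ParDo(RowMonitor) carries no schema, so no default Coder can be inferred. As the exception text itself suggests, the usual remedy on the pipeline side is to attach the row schema (or an explicit coder via setCoder) to that output. A minimal, hedged sketch of that remedy follows; the schema fields and the pass-through DoFn are illustrative, not the actual HACKER_NEWS schema or the test's RowMonitor:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Stand-in for the test's RowMonitor: passes rows through unchanged.
    class PassThroughFn extends DoFn<Row, Row> {
      @ProcessElement
      public void process(@Element Row row, OutputReceiver<Row> out) {
        out.output(row);
      }
    }

    // Illustrative schema; the real HACKER_NEWS table has more fields.
    Schema schema =
        Schema.builder()
            .addNullableField("by", Schema.FieldType.STRING)
            .addNullableField("type", Schema.FieldType.STRING)
            .addNullableField("title", Schema.FieldType.STRING)
            .addNullableField("score", Schema.FieldType.INT64)
            .build();

    PCollection<Row> monitored =
        rows // a PCollection<Row> read from BigQuery (assumed to exist)
            .apply("RowMonitor", ParDo.of(new PassThroughFn()))
            // Without a schema (or an explicit coder), the runner cannot infer a
            // default Coder for Row -- the IllegalStateException reported above.
            .setRowSchema(schema);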

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 24, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 24, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 24, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 24, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 24, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 24, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 24, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 24, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 24, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
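
Note: in the passing readUsingDirectReadMethodPushDown case above, the Beam SQL planner rewrites the query into a BeamPushDownIOSourceRel so that only the four referenced fields are read and the filter shown here is evaluated by the BigQuery Storage API rather than in the pipeline. A rough sketch of issuing an equivalent query through SqlTransform is below; registering the HACKER_NEWS table with a BigQuery table provider (with method DIRECT_READ) is assumed and not shown:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.create());

    String sql =
        "SELECT `by` AS author, type, title, score "
            + "FROM HACKER_NEWS "
            + "WHERE (type = 'story' OR type = 'job') AND score > 2";

    // Assumes HACKER_NEWS is resolvable via a registered BigQuery table provider
    // using DIRECT_READ, so the planner can push the projection and filter down
    // into the storage read, as in the plan logged above.
    PCollection<Row> result = pipeline.apply(SqlTransform.query(sql));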
    Sep 24, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-p3fLmjuguhQsTjx0JRMaq05JkyKKsfAMvj1-x81MrBA.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-Hid8dxTUQneyLAIBHlMduwmHtVlkA7gvFg15RnuY3Kk.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-96RiI30v5nb3Y_sbiXrbvHBfGSsIvexVpgrm_ErPK_A.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-x9leiPlg80HQltjz0eH7UEA8ZBDBcyNLj_N3e8_e8Qo.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-oHVoCPgMTmiUGFXFLWSNVt6NtSuXwKTovRsDka8W8qM.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-W4Rt3ZVzgBOzuQp1iVXXD4z6kZ5TMBbijiX96Pj1Uto.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-fgL5ATwPheAB7j70f4iLRCjLK3I7ZZBcUnkliA8s9OE.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-0RN0M66vTmT2mvK5vELAA5eWvA08hwtGnJ0PB2dJ21w.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-hh-Dan6wEp8pulTUCsaiQbvEex89E-podCxB2OboW-Y.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-386Lf1kumpbw8PtHy0TU5-G3blkEuaCkK3J_rpI0DDc.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-dICsQRdzzuZpzb_GdoNacpEUHCV_gXhBno9ODiEKgKI.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test9023185273535501287.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-xQgQASQnWY2fq_udZZiDq-s71YT6EiTWe2g4Gbf4WHA.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-p3fLmjuguhQsTjx0JRMaq05JkyKKsfAMvj1-x81MrBA.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-vpq40Ui7MsNpDhwL6Mw-KoaiuWf5Tp8pNbDSDnxe_nI.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-7EA2EpZDaHbwjIGr0tRrozn3g0JERnONsL7c8jB96BU.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-v-TGvC18fwZ8Tj5dmEPOPy0s_Ha66wDvHPY2VnalDO0.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-1-q_Yjgd4QLKha0Gfg_VU3lTG18yuSqGCPNIWUY7Rjo.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-VR9IGbx9PbkeN5_QI1b7qghOO04qUaLZlQS7Tjh7a0g.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-G3MyBfAjxEd_8sb5PUy_9b2_t6D0kImrP6x8b8dNM3o.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-IcmIqC4QRBEW209yFu2qZ58EftJ-_r2NO7fQNXikQ1Q.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-9Y0cEJP--xsLgxmnB_gRKvr7HHoPJIFDoYD4ZaZhfQk.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-ujrSkklbnqtoXPV2oVSv2iHwM5cAB7g-hTaBN0_Q_vw.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-yuZHYe_OmCOHCrv6VyqDZp4IOqHsch96reccCLqE0dQ.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-7EyO6y2KUvVgILqTXgYGPIbo3UVSLRk22yDtu7tGM2Q.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-mJSITdn5JoIZNIDrrsvTzYzLzv96wghavqaFH0EY2FY.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-2RJGda5xvSKFSEeSdL5Diy3Lue3b0YDbp4VjGuQJyBw.jar
    Sep 24, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-3t28CJDkVJvXrhww6SyBZwLFMZTRIJig_PyhtbOYoIk.jar
    Sep 24, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 1 seconds
    Sep 24, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 24, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 24, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 24, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 24, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 24, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95914 bytes, hash ebfdc9d8579a1d7dd80fecbd7c8cca83826d21eaa9aba67c2aff3e1c73b9900e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-6_3J2FeaHX3YD-y9fIzKg4JtIeqpq6Z8Kv8-HHO5kA4.pb
    Sep 24, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 24, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-24_05_45_25-4644725188128550442?project=apache-beam-testing
    Sep 24, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-24_05_45_25-4644725188128550442
    Sep 24, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-24_05_45_25-4644725188128550442
    Sep 24, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-24T12:45:25.231Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 24, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:45:33.153Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 24, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:45:33.928Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 24, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:45:33.955Z: Expanding GroupByKey operations into optimizable parts.
    Sep 24, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:45:33.984Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 24, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:45:34.046Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 24, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:45:34.075Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 24, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:45:34.109Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 24, 2020 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:45:34.140Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 24, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:45:34.540Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 24, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:45:34.628Z: Starting 5 workers in us-central1-b...
    Sep 24, 2020 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:45:46.979Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 24, 2020 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:45:59.934Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 24, 2020 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:46:18.669Z: Workers have started successfully.
    Sep 24, 2020 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:46:18.698Z: Workers have started successfully.
    Sep 24, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:46:49.105Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 24, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:46:49.310Z: Cleaning up.
    Sep 24, 2020 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:46:49.408Z: Stopping worker pool...
    Sep 24, 2020 12:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:47:40.306Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 24, 2020 12:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T12:47:40.353Z: Worker pool stopped.
    Sep 24, 2020 12:47:50 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-24_05_45_25-4644725188128550442 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 80850963-d7d3-4d29-aa4a-87a6c7086e13 and timestamp: 2020-09-24T12:47:50.686000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    10.913

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 24, 2020 12:47:51 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 40.328 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 34s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/akwlc7di2jvzs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1033

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1033/display/redirect?page=changes>

Changes:

[Robin Qiu] Moving to 2.26.0-SNAPSHOT on master branch.

[noreply] [BEAM-9616] Consolidate Element and DoFn json impl (#12925)


------------------------------------------
[...truncated 272.88 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 24, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 24, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 24, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 24, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 24, 2020 6:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 24, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 24, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 24, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 24, 2020 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 24, 2020 6:45:30 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-ihebk17_TaRao07kB6MTMa0iWxvuQOwGrENFFG1dVjU.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.26.0-SNAPSHOT-bd-UPIlu_vnP8gTSGpn0uD6QMeuq27opeNCWRVkL9yc.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-9Q9BVFqVrHKjfAcjmV7ZgKGwI84AMmZPVSbIUrmRtRo.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-tests-VTTdJyXI1aGwiDRbdItkSAcvBJincyFIsxvYpqwNTOQ.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.26.0-SNAPSHOT-CyJ3OwDEGwXomYY9ZRknvSJNO7UNAIx-cyTxt4-PIMk.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-tests-SED3nJwEOZSU-iSS1PdvQiTp0EAcsZTYr30EK-zH558.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.26.0-SNAPSHOT-ihebk17_TaRao07kB6MTMa0iWxvuQOwGrENFFG1dVjU.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-tests-Va6dJtgowgJWZ_lE0Zyx-Mjrx3K1E-SY3SfkfwuVM-I.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-0acWQu3Hn7vOpkbuKt7Dq8jwXRhJ4ubDk0BAaynMQVA.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.26.0-SNAPSHOT-cRXW5QzPhItAwovzyJzr5ZZlyiW7-qZXrCp7sNUEc6M.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-OW-SJL6H4BlsUhiPJP0Fz_ndneBmzbuNX1URJrApxUk.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-tests-9k4fXxvF2mipXBvGjtdpN1FreBInL3f0MhUmCZdVTCk.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.26.0-SNAPSHOT-QMniAQYzyc0Vz06XU-BzQ3vM4WfvPvt6sAgxAyKOxFA.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.26.0-SNAPSHOT-Z4UXA6HOj5DHlNJg1yHR4UAozsIOe3jf9ybgK0gUg4s.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.26.0-SNAPSHOT-tests-wtHNrogSGaCPzX65IqTI7A-KRpgxekyogz6-JCC4kpM.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.26.0-SNAPSHOT-V3q_-65sUBevoMjEilp0Zi-tktKveNhcMbC_pmqTL1k.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.26.0-SNAPSHOT-7g2dEAj6RMwykzX2oqaUgMijamKIY87IVtYy-b862B4.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-tests-UuhNRZi_sQyqMsgKtPaHPVHV8T7st4LriP85FXFs064.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.26.0-SNAPSHOT-L1msJPOb-NtJO2kr9Hs7mVLAXW4yGEoL-ffoEi0Jycw.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.26.0-SNAPSHOT-U4NKmU4i1LZL_T8l_RYs1EQPvMWVbfGdtAZu1fLfUVM.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.26.0-SNAPSHOT-SdjygCR-eHpDxws2B4CLpv7ZRYW20F3aD01gam9t3PY.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.26.0-SNAPSHOT-tests-TZKqXuVWCoMdcQV1KUxMXkyDJn24k_exMzoVrk_mNKw.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-8V0FNdJf0aR5g4n8xnzyl21X0lYqKSqSVXalEsyxGNE.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.26.0-SNAPSHOT-PjgsuR-Sh8deibg41mP_vYoCV8PgZ0Y9LKGcfz2UaCo.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.26.0-SNAPSHOT-SpQ5f-MgjsxKXA6x6Ul8g1wEdgzKgWfw9TRjxs_yFQE.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test317891621245546609.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-zjkflGQA8lrSIZwTbwThX9adovRO2LwZbpTB-wgX6Ho.jar
    Sep 24, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.26.0-SNAPSHOT-unshaded-HJNJm2DgDo-jDjcXopC0CWJXet3pNtsu3ib7PhWPn-4.jar
    Sep 24, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 0 seconds
    Sep 24, 2020 6:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 24, 2020 6:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 24, 2020 6:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 24, 2020 6:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 24, 2020 6:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 24, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95913 bytes, hash 02e73f80f59c863dcf4e071daed4ca4211d43f0b84fd440eaef0ab15cbe309eb> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Auc_gPWchj3PTgcdrtTKQhHUPwuE_UQOrvCrFcvjCes.pb
    Sep 24, 2020 6:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.26.0-SNAPSHOT
    Sep 24, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-23_23_45_33-4151554466615201999?project=apache-beam-testing
    Sep 24, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-23_23_45_33-4151554466615201999
    Sep 24, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-23_23_45_33-4151554466615201999
    Sep 24, 2020 6:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-24T06:45:33.661Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 24, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:45:46.033Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 24, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:45:47.314Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 24, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:45:47.359Z: Expanding GroupByKey operations into optimizable parts.
    Sep 24, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:45:47.389Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 24, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:45:47.575Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 24, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:45:47.870Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 24, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:45:47.973Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 24, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:45:48.159Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 24, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:45:49.566Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 24, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:45:49.666Z: Starting 5 workers in us-central1-b...
    Sep 24, 2020 6:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:45:58.465Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 24, 2020 6:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:46:16.761Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 24, 2020 6:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:46:37.350Z: Workers have started successfully.
    Sep 24, 2020 6:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:46:37.392Z: Workers have started successfully.
    Sep 24, 2020 6:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:47:08.575Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 24, 2020 6:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:47:09.213Z: Cleaning up.
    Sep 24, 2020 6:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:47:09.289Z: Stopping worker pool...
    Sep 24, 2020 6:48:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:48:44.361Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 24, 2020 6:48:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T06:48:44.405Z: Worker pool stopped.
    Sep 24, 2020 6:48:52 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-23_23_45_33-4151554466615201999 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b6603a0c-638b-4044-bdaf-a5a6a9afae85 and timestamp: 2020-09-24T06:48:52.121000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     12.57

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 24, 2020 6:48:52 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
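
This warning means the run itself finished but its metrics were dropped: InfluxDBPublisher only publishes when a measurement and database are configured. Those values normally arrive through the integration-test pipeline options. The invocation below is a hypothetical example; the option names and values are assumptions for illustration, not taken from this job's configuration.

    ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest \
        -DintegrationTestPipelineOptions='["--influxDatabase=beam_test_metrics","--influxMeasurement=sql_bqio_read_java_batch","--influxHost=http://localhost:8086"]'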

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 3 mins 31.261 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 35s
107 actionable tasks: 65 executed, 42 from cache

Publishing build scan...
https://gradle.com/s/azk2sf3iy5p7y

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1032

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1032/display/redirect?page=changes>

Changes:

[noreply] Revert "[BEAM-10861] Adds URNs and payloads to PubSub transforms to

[Valentyn Tymofieiev] [BEAM-9372][BEAM-7372] Remove Py2 and Py35 test suites.

[Valentyn Tymofieiev] [BEAM-9372][BEAM-8371] Sunset Python 2 and Python 3.5 support in Apache

[Valentyn Tymofieiev] [BEAM-9372][BEAM-7372] Clean release script and correct naming pattern

[srohde] Fix BEAM-10956

[noreply] [BEAM-10586] Remove Python 2.7 and Python 3.5 support in Dataflow

[noreply] Merge pull request #12912 from [BEAM-10938] Adds support for writing a

[sambvfx] [BEAM-8660] Use PortableOptions.artifact_endpoint if provided over

[sambvfx] Add artifact_endpoint to TestJobServicePlan

[sambvfx] Add simple test for artifact_endpoint

[noreply] [BEAM-9547] Roll forward #12858 (#12920)

[noreply] Merge pull request #12782 from [BEAM-10950] Overriding Dataflow Native

[noreply] [BEAM-10844] Add experiment option prebuild_sdk_container to prebuild


------------------------------------------
[...truncated 282.16 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
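
The root-cause list in the exception points at the remedy: the ParDo's output PCollection<Row> needs an explicit schema (or coder) before the pipeline can be finalized. A minimal sketch of that fix follows, assuming an illustrative schema and a hypothetical RowMonitorFn standing in for the test's monitoring transform.

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Illustrative schema for the projected columns; the test's real schema may differ.
    Schema rowSchema =
        Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

    // rows: a PCollection<Row> produced upstream (not shown).
    // RowMonitorFn: hypothetical stand-in for the test's monitoring DoFn<Row, Row>.
    PCollection<Row> monitored =
        rows.apply("RowMonitor", ParDo.of(new RowMonitorFn()))
            .setRowSchema(rowSchema);  // equivalent to .setCoder(RowCoder.of(rowSchema)), so coder inference no longer fails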

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])
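
Note the contrast with the DIRECT_READ runs: with the DEFAULT read method the plan above keeps the full-width BeamIOSourceRel and evaluates the whole predicate in BeamCalcRel, so no projection or filter reaches BigQuery. Which method a table uses comes from its table properties; a hedged DDL sketch follows, with the table name, columns, and location being illustrative only.

    CREATE EXTERNAL TABLE HACKER_NEWS (
        title VARCHAR,
        score BIGINT,
        `by` VARCHAR,
        `type` VARCHAR
    )
    TYPE bigquery
    LOCATION 'apache-beam-testing:some_dataset.hacker_news'
    TBLPROPERTIES '{"method": "DIRECT_READ"}'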


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2020 12:45:42 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 24, 2020 12:45:43 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 24, 2020 12:45:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 24, 2020 12:45:44 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-4XXipBPbqYkasCgTgYyyTFgLC4H5WTMJk9ILZI6E10I.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-PU21hyoB053NBjHcorM-7tGXVqVWc15OU0y1DmKszLY.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-jzGfkWmNjOQR1ExicMtkA1m1vfZFvwtU89CcyvIXmDk.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-3Xks6sO606B_hCoTHSGQ1f-oh3vW9T8qjCLLLCxcbjk.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-5hVGWqs6qiYrc4hfxjwtVLxAOznT8up4RUYjlFVLjGg.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-Ea5Ad82bf8RjS7JkoCgY-Atjynk1ClAE7izyqpGA2OA.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-6gBYtmvF2SSHSlEZ75exRPk0qgRLM6P4mxC3i63tqrU.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-kPNEg6UhDFi85NgL5SR6lr-AIvsiq16y0AJcg2qfYho.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-EBkvxq3zjlfQaJimBYzSOot79dyosiuUZnSEvS1kNrk.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-OSlIvyl8x9Sx6qlcvcl4fV1f780tzI2TCw27VZNQcc8.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-ipcthZIr1EcYbPTbM72jk9bzyJVngCJRnuTuZhXdWm0.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-3Vo6I6_5xNcWLHZj6o7cfOhnyIF051iuY6JOzSDyQkU.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-9ye2T-rD49YqzE9zsxPCUaukoDw0AIY8GjMJ1FNOCPA.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-XpC3WCdde3NxXVWlRnXN20ELwr0YhmRqGQ8GjS_FweM.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-RURAHSZXcwVbxnCKJdbkzPU96FAr7YL-g1fvy9zilik.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-4vN8y6u0Nrd_iUKnlglu3HwQss5EkDausS0DrEXcNRw.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-DcFRgTVK9wCKQUrBDmNo3UVvBqZy-Pp0G2u8rTr38Ds.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8574567814791560331.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-zchzAgee1ozf61jv4CQP4E5ayW7iP1MSRNpn5y8CuYY.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-igyMNAvjxh8vtrslKtGSylCxU-8rZGHaCdnQoFZsgPo.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-4XXipBPbqYkasCgTgYyyTFgLC4H5WTMJk9ILZI6E10I.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-vgNQeqZcpJ2qJzobze4IBbne5dHM-FWdIyYwV-WXGcI.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-PUiTdvBdYLrEWSNBOIL1EfvAwvC5BBmocelkdENSGuE.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-6T4_bFH9Z5APkTsxBFTrfjRYeRXOUVsu9vLnRNRuwbw.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-aGr0Gfxh9WDkCRCXZmCGT-DQcTnVnpqfzFo2Ap0M1GE.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-Ztf5O4IHJ6lghO4sE58u-lUDBdT4bxA92uTFtrgfE-8.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-GROsk9Ho2JGwBX5m4luL6k39PYV1FgiLXdMWIRGF7eI.jar
    Sep 24, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-ePkJiJstM-ypfNHINUcW1YsMzCYBhWCmQCmgqdtQibI.jar
    Sep 24, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 0 seconds
    Sep 24, 2020 12:45:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 24, 2020 12:45:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 24, 2020 12:45:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 24, 2020 12:45:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 24, 2020 12:45:48 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 24, 2020 12:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash 7ac8ff65a6dd3371ee4545dfa64d2f214306d087254c91d66f101a085a54a51a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-esj_ZabdM3HuRUXfpk0vIUMG0IclTJHWbxAaCFpUpRo.pb
    Sep 24, 2020 12:45:48 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 24, 2020 12:45:50 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-23_17_45_48-17388493341644479826?project=apache-beam-testing
    Sep 24, 2020 12:45:50 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-23_17_45_48-17388493341644479826
    Sep 24, 2020 12:45:50 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-23_17_45_48-17388493341644479826
    Sep 24, 2020 12:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-24T00:45:48.815Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 24, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:45:55.722Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 24, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:45:56.362Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 24, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:45:56.404Z: Expanding GroupByKey operations into optimizable parts.
    Sep 24, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:45:56.438Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 24, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:45:56.514Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 24, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:45:56.546Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 24, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:45:56.579Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 24, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:45:56.666Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 24, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:45:57.037Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 24, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:45:57.126Z: Starting 5 workers in us-central1-b...
    Sep 24, 2020 12:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:46:12.681Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 24, 2020 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:46:21.439Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 24, 2020 12:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:46:44.159Z: Workers have started successfully.
    Sep 24, 2020 12:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:46:44.193Z: Workers have started successfully.
    Sep 24, 2020 12:47:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:47:19.837Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 24, 2020 12:47:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:47:20.002Z: Cleaning up.
    Sep 24, 2020 12:47:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:47:20.103Z: Stopping worker pool...
    Sep 24, 2020 12:48:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:48:12.065Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 24, 2020 12:48:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-24T00:48:12.108Z: Worker pool stopped.
    Sep 24, 2020 12:48:20 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-23_17_45_48-17388493341644479826 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0e76bc9f-7bc7-4f39-a5df-e27b01e5f573 and timestamp: 2020-09-24T00:48:20.118000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    16.468

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR

Gradle Test Executor 4 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED
    Sep 24, 2020 12:48:20 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

3 tests completed, 2 failed
Finished generating test XML results (0.053 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.072 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 46.2 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 3s
107 actionable tasks: 69 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/hl3fux3i47v2u

Stopped 3 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1031

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1031/display/redirect?page=changes>

Changes:

[srohde] Add the ib.recordings API

[srohde] Fix macos IB recordings test

[srohde] Delete RM reference from environment when evicted

[srohde] Move pipeline_var into RM constructor

[aromanenko.dev] [BEAM-2546] Update CHANGES.md and add some checks


------------------------------------------
[...truncated 273.62 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 23, 2020 6:45:41 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
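In the push-down case the planner rewrites the read so that only the used fields are fetched and the supported filter is evaluated by the BigQuery Storage Read API instead of inside the pipeline. A rough equivalent expressed directly against the public BigQueryIO API is sketched below; the Hacker News table reference and the restriction string are illustrative assumptions, not what the test harness actually configures.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Read only the used fields and hand the supported filter to the
        // BigQuery Storage Read API, mirroring what BeamPushDownIOSourceRel reports above.
        PCollection<TableRow> rows =
            p.apply(
                "Read with push-down",
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full") // illustrative table
                    .withMethod(Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction(
                        "(`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        p.run().waitUntilFinish();
      }
    }

withSelectedFields and withRowRestriction are intended for DIRECT_READ table reads (from(...)), which matches the "BigQuery method is set to: DIRECT_READ" setting reported earlier in this log.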
    Sep 23, 2020 6:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-tgGg0H49ItA83tCtqME72oyUd9MFRqLimgf9yJCY5V4.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-ykpfgzSpm_ycK9xNl2aKbbKwASCg1gdZ4OX25HNadxM.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-TM-1yYXNW-XxhI4bHVq8Uu2dviAvCO2hHU-pu2UNKeE.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-T5hs_sxY5YJiH0VJEm0nNJ-LVTkMGIEJsOhi-0Ogagc.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-asGXDQ2WADVAkMTAvHflX3YCYHXabhDpryqllU4AT30.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-XMu_8wA4byr6jCRmmuyLPBH8z27GLDh4TyiuZKfY9bU.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-ylakBT9SnBZHCJPqzbgzFfX4RZp1kkWutSRXJuIueVM.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-q3bbbBbfxZbfg6GMVi7YGEUqg7pJ5z5JRAMcq_9BPxI.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-jUXYncsmM7g9zp5cxst7my3B6uwV_qZiAFfX3fNEjkE.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-ZXD8b1woDPrd_QeqoI7sXG9FdqYHP6m_Ze9K4Ghg_xc.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-mOEbUi8Ksum2f6-M-p5PFx8RKcOkrT6dK33SiOWF6_4.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-QCPZbwFngVcj6ofZVGKA94RFt2SrSsTdSP5nm0h9XyE.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-edfC4qdnH4Iukaxow9RLABQal20EVC0sfnXOSMWgmb0.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-tgGg0H49ItA83tCtqME72oyUd9MFRqLimgf9yJCY5V4.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-9kLD4KLD7FXBeOfbxKikd98LoC9uC20AiEwfbQ5Vbq8.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6455771520532046824.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-JlLCWTKgFaqrc5Lwg3WG_J9L_XA6DaZb-_ti2ihNpVY.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-cCnOaXVMTPgnQbN1wt-j_wvdURTXUKqxVs-neiqL3Gg.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-iIwTdDONzLaMGAlE9ytnTzNLYDnX05_HIhoNVHhrDOI.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-P00sGi_Gn-L7FRi8zlMXJsx1j22AnM5r7iQxiK0Jm1o.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-RPmNqGGq9RmWTrt4UrrVNW5VWSSusdSP6uPZqhZT_YI.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-sGS49OO1zCNknXQoD7toIgOi5-3q78FlfK7IJMmF9ao.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-X_GIuX0-2ajOGn6P29egAAuvOWI7nUAX8xT4bHqLMPw.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-Jeh66QJ14e86zaHT7kQ-mlqnr5yGH1KWkfMDMmgDRcQ.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-Ny36a_afRM0HXC_oRQ1xXhWWxo81_WuORJKMCY3YKjM.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-GrDZodBGsKVKICopI8iZCvwgT2Cw3ej4oozntRTvNgg.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-pC_y-gb9AdeRP31s-sT3m1aj-RmARo3oLw0EQJIxPzU.jar
    Sep 23, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-tyYHE2pGpTfOkyNEtJm6B-_B7Dq6MyVepWRAxxJTb_U.jar
    Sep 23, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 0 seconds
    Sep 23, 2020 6:45:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 23, 2020 6:45:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 23, 2020 6:45:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 23, 2020 6:45:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 23, 2020 6:45:46 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 23, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95914 bytes, hash 5db7a36e7f24a284a98e6009e5ac0e15a63d7b46a6a9c5b68b6938f6e5a9132a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Xbejbn8kooSpjmAJ5awOFaY9e0amqcW2i2k49uWpEyo.pb
    Sep 23, 2020 6:45:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 23, 2020 6:45:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-23_11_45_47-9177616685045180341?project=apache-beam-testing
    Sep 23, 2020 6:45:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-23_11_45_47-9177616685045180341
    Sep 23, 2020 6:45:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-23_11_45_47-9177616685045180341
    Sep 23, 2020 6:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-23T18:45:47.145Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
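The warning is about an option interaction rather than a problem with the job: when autoscalingAlgorithm is NONE, the pool size is fixed at numWorkers and maxNumWorkers has no effect. A small sketch of the corresponding Dataflow options, with illustrative values, follows.

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class FixedPoolOptionsSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);

        // With autoscaling disabled the pool stays at numWorkers;
        // maxNumWorkers is ignored, which is what the warning above points out.
        options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE);
        options.setNumWorkers(5);    // fixed worker count actually used
        options.setMaxNumWorkers(5); // no effect while autoscaling is NONE
      }
    }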
    Sep 23, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:45:55.488Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 23, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:45:56.082Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 23, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:45:56.124Z: Expanding GroupByKey operations into optimizable parts.
    Sep 23, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:45:56.155Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 23, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:45:56.234Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 23, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:45:56.267Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 23, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:45:56.315Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 23, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:45:56.339Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 23, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:45:56.847Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 23, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:45:56.928Z: Starting 5 workers in us-central1-b...
    Sep 23, 2020 6:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:46:15.333Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 23, 2020 6:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:46:24.059Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 23, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:46:45.238Z: Workers have started successfully.
    Sep 23, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:46:45.269Z: Workers have started successfully.
    Sep 23, 2020 6:47:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:47:18.607Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 23, 2020 6:47:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:47:18.777Z: Cleaning up.
    Sep 23, 2020 6:47:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:47:18.859Z: Stopping worker pool...
    Sep 23, 2020 6:48:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:48:07.849Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 23, 2020 6:48:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T18:48:07.896Z: Worker pool stopped.
    Sep 23, 2020 6:48:18 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-23_11_45_47-9177616685045180341 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 5fb96e28-b024-43b3-9e90-d971b7463588 and timestamp: 2020-09-23T18:48:18.971000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.291

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 23, 2020 6:48:19 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 2 mins 46.96 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 1s
107 actionable tasks: 64 executed, 43 from cache

Publishing build scan...
https://gradle.com/s/bxmsismad36qy

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1030

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1030/display/redirect>

Changes:


------------------------------------------
[...truncated 269.61 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 23, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 23, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 23, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 23, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 23, 2020 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 23, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 23, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-7tBzDPKF4a9eLwHownqeN8GackD_nj0OXHk1vkEfoDE.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-O2Ork-erTFjhx9LWFEAsi9rjAc2YtM5ilCL8nTxMqmQ.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-jepR-uA9dSUp00X60FrVUqcwwJEQPzROlPffOPoXbvE.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test960500431479145301.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-uULHcKVbyivY1vNmkLRxSyu8aBnnFPYFb-JNCPhsQaQ.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-c6eECWDJ4387XZwlzd5HGMJsrLm2z0CJeVo2oGeFauY.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-ysn0EfN2O9vtncOV5IqdDub_tUOOxSOYl6XeiVu_ZY4.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-dEN5sSoliby8lbhgcw2Vx_JCHrLNPvr0lb48CIoD-TM.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-NnnyRbKA8OzXia1D5LMOeVlTLEDI-Q6ddDnfFdtfIa8.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-vOjPjAw_BT1AGEfvy3eMoQ736QZ1G5d2vZH5HnyRIGM.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-LIok_62TaLjesANjiThUGH2G1Xo3IXJy12Qtww3LVwg.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-QAbzCSJn23fSHGajwCPYMAP_r3fXg2HnyAXNv7gw5iY.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-J7vXyRLCbIohmnKKYEGIB0mSZIhFog_0Hw6QGTpzSvk.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-4a9FqzKvIhhpczmTXFraCCwhwHq7BJJusP6SutM9B0w.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-5ErGbYLaxcfkOeaMEV9Lf4z8WVY-lRNfgmfQZDKg--E.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-FCwu1FWM_cjRGNnMr8Ig18lgoE4p3oSzxrJCRt0Pf9g.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-k5mdUTmPz5L-n1CUmDU3xfGb7DFT3rciL60Lt79qVPs.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-7tBzDPKF4a9eLwHownqeN8GackD_nj0OXHk1vkEfoDE.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests--Zmm9mNHxoniy7NsQAgnKcY5Baf_llmrl4fbhRyhyio.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-RpygISfPbs0_nex5Ao47eHP7Rsgp5zRHCQBL5qJ4Sm0.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-lhltfCSi2Iqz9OjcNjsFdb8FlOr-N_LLI6rnf5_PBIw.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-qHswIQZnOCwQtt_jFGGxbAHa38Sza1OLoz9743idnE4.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-RSkMtsrUd5qeZZ7Gl_iVrYvq3wLlsBQZ9MjgTsNjAQI.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-fR_pfhch9hjwfqSquKaVxpuBFKV7nIFQ7RBM1OqBSUI.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-fnGPkVFOSm494w-RCTtP35YpZ0Ts6Wxcy24D-g4h3uc.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-blpzpfdshrrCRswKpcByWKgz3p7jy3O2MyZzVmb-aK0.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-kr2qsxcHilOqa8rFivnRVMlhNK6emeJyDEvv1aOjRqg.jar
    Sep 23, 2020 12:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-_kJUKEetOd_W6HZL5VL7am7AvcbNqYVac1OHLvQg7Io.jar
    Sep 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 1 seconds
    Sep 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 23, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95913 bytes, hash 4bd1cacc0f3416ca4120848c42736b58b91cd4c92e3d85601d648baf121de150> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-S9HKzA80FspBIISMQnNrWLkc1MkuPYVgHWSLrxId4VA.pb
    Sep 23, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 23, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-23_05_45_19-13033379796155648559?project=apache-beam-testing
    Sep 23, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-23_05_45_19-13033379796155648559
    Sep 23, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-23_05_45_19-13033379796155648559
    Sep 23, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-23T12:45:19.365Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 23, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:45:28.316Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 23, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:45:29.517Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 23, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:45:29.591Z: Expanding GroupByKey operations into optimizable parts.
    Sep 23, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:45:29.609Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 23, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:45:29.694Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 23, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:45:29.724Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 23, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:45:29.759Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 23, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:45:29.788Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 23, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:45:30.148Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 23, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:45:30.226Z: Starting 5 workers in us-central1-f...
    Sep 23, 2020 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:45:55.717Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 23, 2020 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:45:56.544Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 23, 2020 12:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:46:16.754Z: Workers have started successfully.
    Sep 23, 2020 12:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:46:16.794Z: Workers have started successfully.
    Sep 23, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:46:52.635Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 23, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:46:52.771Z: Cleaning up.
    Sep 23, 2020 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:46:52.842Z: Stopping worker pool...
    Sep 23, 2020 12:47:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:47:39.545Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 23, 2020 12:47:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T12:47:39.586Z: Worker pool stopped.
    Sep 23, 2020 12:47:50 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-23_05_45_19-13033379796155648559 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 231bfa9e-2f18-4aaf-bdfc-54b84d728c1f and timestamp: 2020-09-23T12:47:50.077000000Z:
                     Metric:                    Value:
                   read_time                    15.226
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 23, 2020 12:47:50 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 44.415 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 33s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/prjgedem44u6m

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1029

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1029/display/redirect?page=changes>

Changes:

[noreply] Revert "[BEAM-9547] Raise NotImplementedError and WontImplementError

[noreply] [BEAM-7372] Drop Python 2 shims and update docstring in

[noreply] [BEAM-10769] Clarify Avro IO documentation. (#12638)

[noreply] [BEAM-10814][BEAM-10570] DataframeTransform outputs elements (#12882)

[noreply] [BEAM-10871] Fix FhirLROIT tests (again) (#12908)


------------------------------------------
[...truncated 269.22 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 23, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 23, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 23, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 23, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 23, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
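
The two failures above share one root cause: a ParDo (RowMonitor) emits Beam Row elements, but the resulting PCollection never gets a schema attached, so no RowCoder can be inferred at pipeline construction time. A minimal sketch of the remedy the exception message itself suggests, using an illustrative two-field schema and a hypothetical pass-through DoFn rather than the IT's actual RowMonitor:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Illustrative schema; the real HACKER_NEWS table has many more fields.
        Schema schema = Schema.builder()
            .addStringField("author")
            .addInt64Field("score")
            .build();

        p.apply(Create.of(Row.withSchema(schema).addValues("alice", 3L).build())
                .withRowSchema(schema))
            .apply("PassThrough", ParDo.of(new DoFn<Row, Row>() {
              @ProcessElement
              public void process(@Element Row row, OutputReceiver<Row> out) {
                out.output(row);
              }
            }))
            // The ParDo output is a fresh PCollection<Row> with no schema attached;
            // without this call, pipeline construction fails with the
            // "Unable to return a default Coder" IllegalStateException seen above.
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }

Calling setCoder(RowCoder.of(schema)) on the same output would also satisfy the coder requirement; setRowSchema is the schema-aware shorthand the error message points to.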

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 23, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 23, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 23, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 23, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
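
    The plan and log line above show project and filter push-down in action: only the four used columns and the supported predicate are handed to the BigQuery Storage API. A roughly equivalent hand-written read using BigQueryIO directly (not the SQL path this test exercises), assuming the public hacker_news "full" table as the source, since the IT's actual table reference is not shown in this log:

        import java.util.Arrays;
        import org.apache.beam.sdk.Pipeline;
        import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
        import org.apache.beam.sdk.options.PipelineOptionsFactory;

        public class DirectReadPushDownSketch {
          public static void main(String[] args) {
            // Expects the usual GCP options (--project, --tempLocation, --runner, ...).
            Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

            p.apply("ReadHackerNews",
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full")  // assumed source table
                    .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                    // Column projection and the row restriction are applied server-side by
                    // the Storage API, mirroring what BeamPushDownIOSourceRel arranges for SQL.
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

            p.run().waitUntilFinish();
          }
        }
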
    Sep 23, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-UidrXTNXUjqQQ9bXw7RnutXVdh3GVU0-OAbtkTE_DRA.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-6v_yX2nKxuhe-titZqRzS4CZPFbEi1IWgt2y9JMenqg.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests--JaEGVLQs8M9aQiRTfz06P6ERlih-QsY1NlrjB2aqh0.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-M1TW7i9HDSS1wK5WPi50PuqrxyIcYJD44TGqmnmNU0w.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-uQxLDq5lYydM_vpfD1bQGoP6Ahim_JnuEoCfBKnIInk.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-a5-YatShy154gE7vk2Qc3tYYGS_3x8kYDSOjBivEP7U.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-eujvq5wl82jWbUbgLp_JekIo7r9-EtU78L7yNczqj80.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-8nVfmJVnEHUJ8SodFYD0fZ_GXgbxaz0GyVe6xyqRKgo.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-Nsu-nJ6pCUVT3aRbxfuKCYgh2nKqIugygrZRcprHISM.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-1SppRaa4G_OxFErdHYleB2p4oHbkr-MpuBnjQeHS2s0.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-mxzHv9r8ptWsxVJ_rVfGLV7-0qcdgz7c2cn8TOWKw4w.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-HCn6yKHHL_DY0vShljjnONefDJSZ5yiWC2kW-3J3roQ.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-56P9dgbR3AHiqGSmmQZDTFrPazViEIt6Ydxv9yZAn_4.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-cLS8FC_qEoZ_Ozye-oynsstTax4N2XkOso2iYwRHdTI.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-Qt0omtQacMfLiXev3zw5JExVUxt0Wp_7qzhfZMYYgew.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-qVPhCGPvYJGIbrwakoaGMJYG8ZIg4URgD6iklathKm8.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-fT_LipMc4Dc5xqdCxkhJq0_bPqJy2Ev2EkLxmW-ZKP8.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-qpD9DA1iHA7dfdaOwYRVAEmrVYnohYE2Pk6T_4SFK6U.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5085017616897959886.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-nYhBzK8chwQDhgNpj4c-b-Zs4u0Tr9B46BS56skNzoU.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-i1hfUZoB-wbz6sqMb_JOqZqGxHg1vP2ff_VZ7TRzKmU.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-9M7CrhpeIQyHDJIJWkC8uH_uONbo-ShJKZErJIiGbKk.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-WZrlCxZYbytWMxqzcwWF0b20GxiP1MrrcRSDwhPU4xY.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-nmm9Dz-29pYWXTP0wWr7VaFbBH1Yg6U2xuiSycDKK1I.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-UidrXTNXUjqQQ9bXw7RnutXVdh3GVU0-OAbtkTE_DRA.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-8nT3fw4t6NA7uQqdeRTeQ4eNV_AOfoownwq4qUZHjFk.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-G9cgmOfroZ91vdSiQsf6N2h2lNoFaGyt64I4OYMwa-k.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-PYGa99XUl0a3PejVDJJKzZWRbYmqJWpVBfpPNdcX7zc.jar
    Sep 23, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 0 seconds
    Sep 23, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 23, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 23, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 23, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 23, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 23, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95914 bytes, hash 12704a5ad4de22d65d9ebd49eed1808724784cf3200f6e27aaaa3b73e30a37a6> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-EnBKWtTeItZdnr1J7tGAhyR4TPMgD24nqqo7c-MKN6Y.pb
    Sep 23, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 23, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-22_23_45_20-2325826040322952474?project=apache-beam-testing
    Sep 23, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-22_23_45_20-2325826040322952474
    Sep 23, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-22_23_45_20-2325826040322952474
    Sep 23, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-23T06:45:20.621Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 23, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:45:29.264Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 23, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:45:30.023Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 23, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:45:30.089Z: Expanding GroupByKey operations into optimizable parts.
    Sep 23, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:45:30.134Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 23, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:45:30.220Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 23, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:45:30.259Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 23, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:45:30.294Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 23, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:45:30.324Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 23, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:45:30.878Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 23, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:45:30.960Z: Starting 5 workers in us-central1-b...
    Sep 23, 2020 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:45:58.976Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 23, 2020 6:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:45:59.511Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 23, 2020 6:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:46:17.413Z: Workers have started successfully.
    Sep 23, 2020 6:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:46:17.466Z: Workers have started successfully.
    Sep 23, 2020 6:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:46:49.481Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 23, 2020 6:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:46:49.876Z: Cleaning up.
    Sep 23, 2020 6:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:46:50.079Z: Stopping worker pool...
    Sep 23, 2020 6:47:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:47:41.525Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 23, 2020 6:47:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T06:47:41.579Z: Worker pool stopped.
    Sep 23, 2020 6:47:48 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-22_23_45_20-2325826040322952474 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 83e6287f-7041-48b4-b21b-85a85a10074c and timestamp: 2020-09-23T06:47:48.767000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.608

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 23, 2020 6:47:49 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 43.238 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 32s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/p2y5eyxinukgy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1028

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1028/display/redirect?page=changes>

Changes:

[MATTHEW.Ouyang] [BEAM-10532] convert NUMERIC field in TableSchema

[Kyle Weaver] [BEAM-10768] Sleep in time-based flush test to ensure correct ordering.

[Kyle Weaver] Move ZetaSQL UDF tests into separate class.

[chamikaramj] Sets sdk_harness_container_images property for all UW jobs

[Kyle Weaver] Move testCreateFunctionNoSelectThrowsException into ZetaSqlUdfTest.

[Kyle Weaver] Add comment explaining flush callbacks.

[noreply] [BEAM-10894] Support for more pandas formats. (#12844)

[noreply] [BEAM-9547] Raise NotImplementedError and WontImplementError throughout

[noreply] [BEAM-4091] Pass type hints in ptransform_fn (#9907)

[noreply] [BEAM-10716] TestPubsub/TestPubsubSignal clean up subscriptions (#12830)


------------------------------------------
[...truncated 271.68 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 23, 2020 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 23, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-HuInn11HjBtS5NTYk9PbUW2qd0kTIdvEoNRbU4gx1Tk.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-2ZyeHhzUxWjP8BZgZkhrr4Km4QYr_NMBZYnxqtTiJCA.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-HmPZNyCTvXyg5jeneJO_Zxd2M5yf5LcNb5SCr6aOPbw.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-mSpXSA353AwY4brJB67llOxJULvvCamft7-Sb6NF5uc.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-J-CFivQ93YVOkA3JbyLrBTqxVxKHAfHz85WNRz5o9Hs.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-SaoHghgpNWLNHg_0HUuKhKGjoctBi3kHMYOgKz6bC4g.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-vsvkWVhzUNjEnnXbaWDn6DkmU24XjrkiCeYEYMAKlcY.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-fLKFyHlyGeBQbPIWY2eOWoERuLU-F0KpThOjq0hlQu8.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-anRG4bNAlZJ9LO1pR4R-oFKsXsde-JiRAnAOgZlWl1E.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4207008138718935195.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-EjyfTIpojWnXkxru_yJ81mNts1Pm-95QZKL0hsqF88I.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-02hOU4dDfiEMlsGsx12KTlzhFMXjrNY-OYOwkjkOUQ4.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-EqH0PcN9405t3iSjkTSQxuNiTJIRGA_DSylEHnyflyo.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-baSQ1Y4hIxLzIzHwmUo7PdusymKJn0H3YoK_BayiLw0.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-88q6hUw_ffSAaL72UB6b-oXO7OZoNHJ8roeB2s-6Gik.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-GPIKdYwqAiPqtz11AC3sVYVETa6VrpJSsR6t5eptgjg.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-jyi4-vGr-O-xfC5Aj4d8JyC3oTDRM9cTAIAz2w_lk1M.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-5nnGK7HhaIEsq9VyPBea8GOz2y9l3zDc1kc5Tl0fi38.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-IwG7yFyDYuo43wJM2edbp_fqOIB0d6HY69EpHAp6BWo.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-vILBY0IOSHXwq09IYdPGpvgeZhnJgxRGY3HtHwFo978.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-ySgO2mqHMivAwsb3pIF72xA00_Lv3R9H_dTbVnDKtQw.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-HuInn11HjBtS5NTYk9PbUW2qd0kTIdvEoNRbU4gx1Tk.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-_e1L1CcAiQmYBCtoRZZQ2lxpH-77ceLARuIu5g3_yQE.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-ysL5voD6gMwgOpFU1P7JJ0-kgstbErMYpGWmn0XSv2c.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-Y_z9YBs_JUNc6FwpJP8KzYhhzB87TAVPGA1-SyKsK4c.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-WDfZ9PHTvV-oEgMuVblJefo9npyZtDQ9e7SKEF3mmeA.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-cV7lusKnm0MUIyjJUE6gk4lhjUTDl3htm-hphYv1ayc.jar
    Sep 23, 2020 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-jRO7MHXAZLP96sVJ35veIFR2AGKWeyfEtCVophDOeCM.jar
    Sep 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 1 seconds
    Sep 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95916 bytes, hash 5b7378cf1cb8d4d28812b549b6093145d2db5e1f8741fdb38abf3e4b7ef7c9df> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-W3N4zxy41NKIErVJtgkxRdLbXh-HQf2zir8-S373yd8.pb
    Sep 23, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 23, 2020 12:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-22_17_45_37-12309088467168574468?project=apache-beam-testing
    Sep 23, 2020 12:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-22_17_45_37-12309088467168574468
    Sep 23, 2020 12:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-22_17_45_37-12309088467168574468
    Sep 23, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-23T00:45:37.624Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 23, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:45:45.686Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 23, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:45:46.220Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 23, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:45:46.329Z: Expanding GroupByKey operations into optimizable parts.
    Sep 23, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:45:46.405Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 23, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:45:46.484Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 23, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:45:46.507Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 23, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:45:46.561Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 23, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:45:46.581Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 23, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:45:47.321Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 23, 2020 12:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:45:47.430Z: Starting 5 workers in us-central1-b...
    Sep 23, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:46:05.005Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 23, 2020 12:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:46:18.165Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 23, 2020 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:46:37.236Z: Workers have started successfully.
    Sep 23, 2020 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:46:37.267Z: Workers have started successfully.
    Sep 23, 2020 12:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:47:10.258Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 23, 2020 12:47:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:47:10.516Z: Cleaning up.
    Sep 23, 2020 12:47:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:47:10.680Z: Stopping worker pool...
    Sep 23, 2020 12:48:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:48:08.883Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 23, 2020 12:48:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-23T00:48:08.936Z: Worker pool stopped.
    Sep 23, 2020 12:48:16 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-22_17_45_37-12309088467168574468 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 97c4ae8a-b803-4cfa-b564-096dd07ad0b9 and timestamp: 2020-09-23T00:48:16.875000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.221

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 23, 2020 12:48:17 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 54.413 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 59s
107 actionable tasks: 63 executed, 44 from cache

Publishing build scan...
https://gradle.com/s/lscmfg7nvyqhk

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1027

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1027/display/redirect?page=changes>

Changes:

[srohde] Add record_pipeline and clear to RM and fix duration limiter

[srohde] Add comments to RecordingManager

[Kenneth Knowles] Upgrade GCS IO to 2.1.5 and Google OAuth to 1.31.0

[noreply] [BEAM-10871] Fix FhirLROIT tests (#12902)

[noreply] [BEAM-2546] Add InfluxDbIO (#11459)

[noreply] [BEAM-9680] Add Aggregation Min and Max lessons to Go SDK katas (#12861)

[noreply] Merge pull request #12762 from [BEAM-10948] Ensuring that BigQuery jobs

[noreply] Merge pull request #12900 from [BEAM-10941] Use standard sharding


------------------------------------------
[...truncated 284.17 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 22, 2020 6:49:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 22, 2020 6:49:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 6:49:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 22, 2020 6:49:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 22, 2020 6:49:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 6:49:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 22, 2020 6:49:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
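
The IllegalStateException above already names the remedies: give the Row-typed output a schema, or set a coder explicitly. Below is a minimal sketch of that fix, not code from BigQueryIOPushDownIT; the class name, field names, and field types are assumptions chosen to mirror the projected columns.

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaFix {
      // Hedged sketch: attach a schema to the Row-typed PCollection, which is
      // what the exception message asks for.
      static PCollection<Row> attachRowSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();
        // Alternatively: rows.setCoder(org.apache.beam.sdk.coders.RowCoder.of(schema));
        return rows.setRowSchema(schema);  // schema-aware Rows (preferred for Row elements)
      }
    }

Setting the row schema is what lets the SDK derive a coder for Row elements, which is exactly the inference the CoderRegistry reports it cannot do on its own here.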

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 22, 2020 6:49:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2020 6:49:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 6:49:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 22, 2020 6:49:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2020 6:49:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 6:49:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 22, 2020 6:49:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 22, 2020 6:49:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 22, 2020 6:49:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 22, 2020 6:49:10 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 220 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 22, 2020 6:49:10 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-ooh0YdkglWSusU5ON1p6tHH0uE86JvAJN89Flh5sVNQ.jar
    Sep 22, 2020 6:49:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-bHC3ZNwhRDrw3rd7CBnWhkf6_gO1Jrw66l8Q9UNYEqs.jar
    Sep 22, 2020 6:49:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-311YmK3zYKNAbLS4VcZlunRJsNQ34ck8NBCYe8iM1_E.jar
    Sep 22, 2020 6:49:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-l7XM8XcPwWCpV-tJNNQSAdfJpvDw--vYfRbJYRz4hoM.jar
    Sep 22, 2020 6:49:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-zyysBM21P72j2XPPYuRh_j2wcGdKJXSwrtmDQe8GI94.jar
    Sep 22, 2020 6:49:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-ooh0YdkglWSusU5ON1p6tHH0uE86JvAJN89Flh5sVNQ.jar
    Sep 22, 2020 6:49:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-mMiWP_Maoe19IoDGXyhe0Epx-qUz43TF_xiREIzsEOc.jar
    Sep 22, 2020 6:49:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-5OBRzo2CFsOPEhIlPru0RMMPIfIUdE5bc1Gi4wBozd8.jar
    Sep 22, 2020 6:49:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-enxgzLhRBtHLFoljY5xheKNShA3wbt6LxpoM_OTzY9s.jar
    Sep 22, 2020 6:49:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-S1LbWrZZZMoGPyburKne6WOTRgB-MWztJytjjPQgMl8.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-8n5XLDYLDgB3b7SCrHsGKshwC2Xo7Zt9Y61jMo3EWqM.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test9191983806847350017.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-S8b5Sn4qljGfxh9r--lL2M56YsVvaF75nd51luYn1sE.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-PITwZ8JOx-3N5yf5knsj-G8pnackcDao6cqe-2-GyLY.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-wMsh2US9FCGgK56oeei2Ysaqdeqbe0QFWweUr4kzpok.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-LYR6dmf5FDNYmfR2-dr7xT3_-OYVVvTqnBRjiGBYfpM.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-SvJAYRVVT-5THd-G1rc0vLp2vEVnxy7DV2IXXjdBY3Q.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-D5T1WAjhNbtHdAMpIK34dWwKOUwXCwKTosMXHEbsiJU.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-zFB36ixE1pjnoQDIMj9KViiDqanb0FKrt1p5OqPkHZU.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-oOsDvTbcSIbPh8Jpr0Hdlo5_cnZ6B8SPcz4n2HX6Jy4.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-if7gjSxCvW_C-piDJQQr1YF917dkIoDBRaC0dh54WjM.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-itcgPEAZz4sRRHfqAZBMep5VtuVIaE8eRkufByQtax8.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-4_dDNw27TkSF1miuYY9YLoeGdjhLa8SBz9T9Kw0ru24.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-_mCVuKrhLMObVpTLYJ8irJpatHR2lurbsiT-MT8myFQ.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-NH7-bHqJOiqZFqsfizj7V2WqiebmtmIywf1CWxdzCcs.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-dgZ5DWrGj_dpCl4dmbUo9x8RvuFTDoHiul_QZ6yEnw4.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-eqVScTn974sbpdrNxntQdGJO2yrA6SHRTnrr7hV3h9s.jar
    Sep 22, 2020 6:49:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-WiXLrFdpvUHINCB1DFGzVMvFNQ0ZhDRfNhBgIptoHoY.jar
    Sep 22, 2020 6:49:13 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 194 files cached, 26 files newly uploaded in 2 seconds
    Sep 22, 2020 6:49:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 22, 2020 6:49:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 22, 2020 6:49:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 22, 2020 6:49:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 22, 2020 6:49:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 22, 2020 6:49:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95917 bytes, hash 4ab6dd47606d7f14c4e61ce938b059086ef4edccf2e46867435eba9e168bc4da> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-SrbdR2BtfxTE5hzpOLBZCG707czy5GhnQ166nhaLxNo.pb
    Sep 22, 2020 6:49:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 22, 2020 6:49:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-22_11_49_14-18164303575540618902?project=apache-beam-testing
    Sep 22, 2020 6:49:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-22_11_49_14-18164303575540618902
    Sep 22, 2020 6:49:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-22_11_49_14-18164303575540618902
    Sep 22, 2020 6:49:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-22T18:49:14.030Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 22, 2020 6:49:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:49:22.014Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 22, 2020 6:49:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:49:22.804Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 22, 2020 6:49:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:49:22.856Z: Expanding GroupByKey operations into optimizable parts.
    Sep 22, 2020 6:49:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:49:22.885Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 22, 2020 6:49:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:49:23.022Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 22, 2020 6:49:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:49:23.051Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 22, 2020 6:49:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:49:23.077Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 22, 2020 6:49:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:49:23.113Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 22, 2020 6:49:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:49:23.600Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 22, 2020 6:49:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:49:23.679Z: Starting 5 workers in us-central1-b...
    Sep 22, 2020 6:49:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:49:31.134Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 22, 2020 6:49:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:49:53.267Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 22, 2020 6:50:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:50:11.270Z: Workers have started successfully.
    Sep 22, 2020 6:50:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:50:11.300Z: Workers have started successfully.
    Sep 22, 2020 6:50:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:50:46.061Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 22, 2020 6:50:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:50:46.266Z: Cleaning up.
    Sep 22, 2020 6:50:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:50:46.359Z: Stopping worker pool...
    Sep 22, 2020 6:51:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:51:38.087Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 22, 2020 6:51:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T18:51:38.131Z: Worker pool stopped.
    Sep 22, 2020 6:51:46 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-22_11_49_14-18164303575540618902 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 45fb1d89-ebcc-40c3-b9a6-7e1820ee8414 and timestamp: 2020-09-22T18:51:46.978000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.149
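
The BeamPushDownIOSourceRel plan above shows the projection (usedFields=[by, type, title, score]) and the supported filter being handed to the BigQuery Storage read instead of being evaluated in a downstream Calc. As a rough illustration of the query shape involved, here is a self-contained Beam SQL sketch over an in-memory PCOLLECTION, so it runs without BigQuery access; the class name, schema fields, and sample rows are made up. Against the IT's BigQuery table registered with method DIRECT_READ, the same kind of query is what yields the pushed-down plan above.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class HackerNewsFilterSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Made-up schema and rows standing in for the HACKER_NEWS table.
        Schema schema =
            Schema.builder()
                .addNullableField("by", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();

        PCollection<Row> hackerNews =
            pipeline.apply(
                Create.of(
                        Row.withSchema(schema).addValues("alice", "story", "Hello Beam", 5L).build(),
                        Row.withSchema(schema).addValues("bob", "comment", "n/a", 1L).build())
                    .withCoder(RowCoder.of(schema)));

        // Same shape of query as the IT. Over this in-memory input the filter runs
        // in a BeamCalcRel; over a BigQuery table read with DIRECT_READ the planner
        // can push it into the storage read instead.
        PCollection<Row> result =
            hackerNews.apply(
                SqlTransform.query(
                    "SELECT `by` AS author, type, title, score FROM PCOLLECTION "
                        + "WHERE (type = 'story' OR type = 'job') AND score > 2"));

        pipeline.run().waitUntilFinish();
      }
    }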

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 22, 2020 6:51:47 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 54.272 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 10s
107 actionable tasks: 69 executed, 38 from cache

Publishing build scan...
https://gradle.com/s/xno52dh363boi

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1026

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1026/display/redirect?page=changes>

Changes:

[piotr.szuberski] [BEAM-9898] Add stub with imports to apache_beam.io.snowflake

[noreply] [BEAM-10916] Remove experimental annotations for BQ storage API source


------------------------------------------
[...truncated 273.40 KB...]
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 22, 2020 12:45:46 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 22, 2020 12:45:46 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 12:45:46 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 22, 2020 12:45:46 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 22, 2020 12:45:46 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 12:45:46 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 22, 2020 12:45:46 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 22, 2020 12:45:47 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2020 12:45:47 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 12:45:47 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 22, 2020 12:45:47 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2020 12:45:47 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 12:45:47 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 22, 2020 12:45:47 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 22, 2020 12:45:47 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 22, 2020 12:45:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-MFuspS7gr09U1LeNzenqwyFfVpIT9ijJzKKjlmPIHEQ.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-3xdgD14D8e1bmBzwvpQ62D_rjQ-ClI2M_1EJ0B4xn0M.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-vMdvWMJcBnZ8twyOYYSyGq18RUoiO4AEDQURJQVZqNM.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Ommkk_FcQd_3ljeZ1SR6rAvwr21NPfyD8l9zj8hyp-o.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-YAjGgpX-Y24Oudevwv8bFUv209wYa9zWN2P9ifu24E8.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-EYdnXbIaSRLxy6gSDEeBm9Rdr97CfUbLDIezZW3W_Q0.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-k5zmcUpkC1Da7Dl4aixZBnJ3SOwZA229OL6cNVUJk8g.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-yastOkDpOm53T-uXHSI4TDjTUZM_MUTeFINCHM2cIVc.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Ommkk_FcQd_3ljeZ1SR6rAvwr21NPfyD8l9zj8hyp-o.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-T-rZbsnGpTb6sY_VYyYOnTDBEbI2zyNcvnPx2mx4-dc.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-c4F_Ith1wgb8iz9fT1pqJ1uppQM-YA89otjaYhrKBUM.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-c9TOwdFfFiRyaONv2gaqbsWv2gKPRuomnAsW8wBHCG4.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-NaE1iWtK7wyF_9MtogBJaDP2cDmvkN7-ELy2GBIdaLQ.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-W4GJOYbVo5RIadIeKLz8RDHWebVBHROTFtSSndMbRv8.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-4Ubi5yD6uNGCmuzLTm3cohwacTbojNVMq4g9RkLna0E.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-jOQJiwe9g5Gbuzoo0MBL3jZZ-N2YA8HmVwnieu4c01E.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-19Cuy7OCRoYLj07BXu5x93rkH1gMK-981A0xNvJ58Y4.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-UEqx7pKcucL7Oirph1hbmx-Ew-z0o5fvW-HOEUP_3BQ.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-KlEFZL0iZo1va_eF5y7GyFPlBYsUdmUv3NftqONTRCk.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-dgKlF_mt2iEaOJ689S7Dr3SKQI3WXKbVPx4AqY7pvY0.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-ShhRTHfSV1_HrL_BBquXA0-JWnzmibbvbaCzhPTq9yQ.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7448176562116324052.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-L32Gy0Pw3X5TNFcf3nFpHrlQhk4Zamr5i3CJCtUse4g.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-22h8rTF7OJeQYKtYKZdLYeUqULQnaclb1DvZAFh6FDU.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-ZY7cnM_NmutL3eUq4xV4vypEwTc3WnXarZJz_AbxKsA.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-I23XE8_OCq6dBuqGvJNxHPraEHSTvV_hgfYaqHbsZMc.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-B7M88wo2ZSgKD3QdonkXnm2Q_LfG-aJ6XO2SH-X7fXk.jar
    Sep 22, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-cr5bW0LGM8ArnW6dMwco0U5TAu8f1K_cUJgPQ6yD3hA.jar
    Sep 22, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 1 seconds
    Sep 22, 2020 12:45:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 22, 2020 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 22, 2020 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 22, 2020 12:45:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 22, 2020 12:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 22, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash 542c406eabd851b8eff11e77c1319d5ac5d14180a16e2433920b80e94d9fa75d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-VCxAbqvYUbjv8R53wTGdWsXRQYChbiQzkguA6U2fp10.pb
    Sep 22, 2020 12:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 22, 2020 12:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-22_05_45_53-5973426972305216413?project=apache-beam-testing
    Sep 22, 2020 12:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-22_05_45_53-5973426972305216413
    Sep 22, 2020 12:45:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-22_05_45_53-5973426972305216413
    Sep 22, 2020 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-22T12:45:53.456Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 22, 2020 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:02.026Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 22, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:02.759Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 22, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:02.807Z: Expanding GroupByKey operations into optimizable parts.
    Sep 22, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:02.842Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 22, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:02.920Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 22, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:02.968Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 22, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:03.002Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 22, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:03.049Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 22, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:03.529Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 22, 2020 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:03.616Z: Starting 5 workers in us-central1-f...
    Sep 22, 2020 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:31.728Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Sep 22, 2020 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:31.767Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Sep 22, 2020 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:35.280Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 22, 2020 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:37.109Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 22, 2020 12:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:59.338Z: Workers have started successfully.
    Sep 22, 2020 12:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:46:59.379Z: Workers have started successfully.
    Sep 22, 2020 12:47:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:47:29.242Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 22, 2020 12:47:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:47:29.380Z: Cleaning up.
    Sep 22, 2020 12:47:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:47:29.484Z: Stopping worker pool...
    Sep 22, 2020 12:48:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:48:27.380Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 22, 2020 12:48:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T12:48:27.424Z: Worker pool stopped.
    Sep 22, 2020 12:48:36 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-22_05_45_53-5973426972305216413 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): e1358db6-e85f-4814-9c54-a014675fc354 and timestamp: 2020-09-22T12:48:36.115000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     11.89
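
The DataflowRunner output above (staging uploads, the monitoring URL, the gcloud cancel hint, and the terminal DONE state) comes from the standard submit-and-wait flow. A minimal sketch of that flow follows; the class name, project, region, and bucket are placeholders, not the values this job used.

    import org.apache.beam.runners.dataflow.DataflowRunner;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class SubmitToDataflowSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        options.setRunner(DataflowRunner.class);
        options.setProject("my-gcp-project");            // placeholder project
        options.setRegion("us-central1");
        options.setTempLocation("gs://my-bucket/temp");  // placeholder temp/staging bucket

        Pipeline pipeline = Pipeline.create(options);
        // ... apply the transforms under test here ...

        // run() stages the classpath jars and the pipeline description (the
        // "Uploading ..." lines above) and submits the job; waitUntilFinish()
        // blocks until the job reaches a terminal state such as DONE.
        PipelineResult result = pipeline.run();
        result.waitUntilFinish();
      }
    }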

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 22, 2020 12:48:36 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.154 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 58.468 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 19s
107 actionable tasks: 65 executed, 42 from cache

Publishing build scan...
https://gradle.com/s/7seyb5vpgcaie

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1025

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1025/display/redirect?page=changes>

Changes:

[Chad Dombrova] [BEAM-7746] Add type checking to runners.pipeline_context

[chamikaramj] Performs Dataflow specific pipeline updates before creating the pipeline

[noreply] [BEAM-9372][BEAM-9980] Makes the Python version in Flink VR suite

[noreply] [BEAM-7372][BEAM-9980] Cleans up Flink precommit VR suite definition and


------------------------------------------
[...truncated 273.48 KB...]
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 22, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 22, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 22, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 22, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 22, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])
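
The plan above is what the Calcite planner produces for the logged query before the test fails. As a rough, self-contained sketch only (not the IT's code; the schema and sample rows below are invented for illustration), the same query can be issued from the Java SDK with SqlTransform, which exposes a single input PCollection to the query under the table name PCOLLECTION:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class HackerNewsSqlSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Invented stand-in for the HACKER_NEWS table, reduced to the four queried fields.
        Schema schema =
            Schema.builder()
                .addStringField("title")
                .addStringField("by")
                .addInt32Field("score")
                .addStringField("type")
                .build();

        PCollection<Row> hackerNews =
            p.apply(
                Create.of(
                        Row.withSchema(schema).addValues("A story", "alice", 5, "story").build(),
                        Row.withSchema(schema).addValues("A comment", "bob", 1, "comment").build())
                    .withCoder(RowCoder.of(schema)));

        // A single-input SqlTransform registers its input as the table PCOLLECTION.
        PCollection<Row> filtered =
            hackerNews.apply(
                SqlTransform.query(
                    "SELECT `by` AS `author`, `type`, `title`, `score` "
                        + "FROM PCOLLECTION "
                        + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        p.run().waitUntilFinish();
      }
    }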


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
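
The IllegalStateException above is the usual symptom of a Row-typed PCollection whose schema was never attached: Beam cannot infer a coder for Row, so getCoder() fails while the pipeline is being finalized. A minimal, hypothetical sketch of the remedy the message points at, attaching the schema to a ParDo's Row output with setRowSchema (the schema and values are invented; this is not the IT's actual fix):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        Schema schema = Schema.builder().addStringField("author").addInt32Field("score").build();

        PCollection<Row> rows =
            p.apply(
                Create.of(Row.withSchema(schema).addValues("alice", 3).build())
                    .withCoder(RowCoder.of(schema)));

        // A DoFn that emits Row gives the SDK nothing to infer a coder from, so the
        // schema must be re-attached to the output. Dropping this setRowSchema call
        // reproduces the "Unable to return a default Coder" failure seen above.
        rows.apply(
                ParDo.of(
                    new DoFn<Row, Row>() {
                      @ProcessElement
                      public void process(@Element Row row, OutputReceiver<Row> out) {
                        out.output(row);
                      }
                    }))
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }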

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 22, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2020 6:45:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 6:45:41 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 22, 2020 6:45:41 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2020 6:45:41 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 6:45:41 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 22, 2020 6:45:41 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 22, 2020 6:45:41 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
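
The usedFields projection and the filter being pushed down here are, in effect, compiled into a BigQuery Storage API read with a column selection and a row restriction. A hedged sketch of the equivalent read written directly against BigQueryIO (the table reference is a placeholder; this is not the IT's code):

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Column projection and row restriction mirror the pushed-down plan above;
        // "my-project:my_dataset.hacker_news" is a placeholder table reference.
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("my-project:my_dataset.hacker_news")
                    .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
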
    Sep 22, 2020 6:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 22, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-B0FcwKXYt5gXMBG8PMdwhG3dJZsol0tShHgxhhwPpVQ.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-cCXB9vb1LUB2kEGz5QTT9cd75wGWA9phiDoiVEvaPhY.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-2yDd7vOAwrPECFworp8CyxudCpvFUTy_piAYZVnc6d8.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-8aGNbFSR676Ub-rOUCiJLRVAYVu6V0OeGSi1VFU_gS4.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-ISxaJTsKYP9C97tw7v05QFrjEtHVYOzsJbV2a3tOSQE.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-B0FcwKXYt5gXMBG8PMdwhG3dJZsol0tShHgxhhwPpVQ.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-GQaZNdb6rjVsX5l2xRKsDReidLvSW4uM8vVjc5hP1pc.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-pCPqoe1M8n7rpebjKqq0-UjBBWiDpjcnr36yoDkho9Q.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-kaj_m0ERIktVmsAT9cuewCtsJbY79lcUejEmqEBIGjw.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-UWc04qPcgiOjzvC0GrUeO27YMhvjueVwpcF4A975m3o.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-7YtuyWLAV6KlBbgnAt00EZwC3dH-kNvd-KuvTvXat7U.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-AMyYYK4FDILaU5scmZNs557gKjlFw_-WMZShDpT8Rvs.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5367325929738414304.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-rm8ZwpC1cOk4tNHh9BzkiUrk8uJLsYj-Nw0fWjvti2Y.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-r_-gfDObr-rjXy4i6ahfMk3zInyc_41UjMG-OqPQSIg.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-SzNQ5BvuVVBcxRkKly4pGcwWO7VFvvEyjOpR4KWneJM.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-o1ExzQhqftE527ZDzHNHckNHvjz-BsiFTAT9gWMMcVE.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-jrzp_8NMwRE064wr2kTbCuP7C69Vy8Ibz1ZUhBYdSeA.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-P_mTew-DCz8BCbaqDkB1kLpPkODRdL5rA9zq-iISxBE.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-bc4nMcUn8HejhSQVIhvkn1TmGYXZ1ykfs6xEmhKItJA.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-1lUgX9VWok6xkiBwBLH8cbRMEATi-7D8nF3G_BAX3UU.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-awQzwBIGdskFT0vPOIhBsktzGKZzb4xIvDpFlz0ORBw.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-FZLjeW23SJ3Qv0WCq116pnT2JbFoFm2AvKoG9woi9Cw.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-_AVTMh70TfsrS0Bz2rSyPCaXMNu4COxCxO38Avwd44o.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-m-pLm5f6ndINcxKueEh-_yjmlT1iztI4KrqeP7j3bvY.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-PYSz4T9NUHk4vqJkY12StYyfr_sCQv5J81PU23XQf3I.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-1L_qepIP0J8zm1Lfbe5XgBlL6Y8oG3QHAfSs14ennhw.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-Sj5hdW4zmfG_2bjGNPwig4F-0zEkXY2WaUQjyJpwl3Q.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-IBxKspjTBSznZLbIVQD--Y2mxdqiRmkQQgXgpgb9wOM.jar
    Sep 22, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 192 files cached, 27 files newly uploaded in 1 seconds
    Sep 22, 2020 6:45:46 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 22, 2020 6:45:46 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 22, 2020 6:45:46 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 22, 2020 6:45:46 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 22, 2020 6:45:46 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 22, 2020 6:45:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash d8759abd6ecad4dcf9d27834ba49cdd6c52e8090d467e6a01113fbebec3e1354> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-2HWavW7K1Nz50ng0uknN1sUugJDUZ-agERP76-w-E1Q.pb
    Sep 22, 2020 6:45:46 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 22, 2020 6:45:47 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-21_23_45_46-2154910386073476390?project=apache-beam-testing
    Sep 22, 2020 6:45:47 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-21_23_45_46-2154910386073476390
    Sep 22, 2020 6:45:47 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-21_23_45_46-2154910386073476390
    Sep 22, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-22T06:45:46.586Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 22, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:45:54.563Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 22, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:45:55.692Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 22, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:45:55.728Z: Expanding GroupByKey operations into optimizable parts.
    Sep 22, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:45:55.756Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 22, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:45:55.852Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 22, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:45:55.880Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 22, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:45:55.913Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 22, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:45:55.940Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 22, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:45:56.282Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 22, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:45:56.359Z: Starting 5 workers in us-central1-a...
    Sep 22, 2020 6:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:46:16.696Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 22, 2020 6:46:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:46:25.515Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 22, 2020 6:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:46:43.276Z: Workers have started successfully.
    Sep 22, 2020 6:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:46:43.306Z: Workers have started successfully.
    Sep 22, 2020 6:47:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:47:21.167Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 22, 2020 6:47:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:47:21.321Z: Cleaning up.
    Sep 22, 2020 6:47:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:47:21.417Z: Stopping worker pool...
    Sep 22, 2020 6:48:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:48:14.066Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 22, 2020 6:48:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T06:48:14.116Z: Worker pool stopped.
    Sep 22, 2020 6:48:24 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-21_23_45_46-2154910386073476390 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 776bac6b-d863-4a68-85ec-7cf945f4be0e and timestamp: 2020-09-22T06:48:24.933000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    18.213

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 22, 2020 6:48:25 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 52.344 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 7s
107 actionable tasks: 64 executed, 43 from cache

Publishing build scan...
https://gradle.com/s/g6ivnq3phs66w

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1024

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1024/display/redirect?page=changes>

Changes:

[ryan.worley] [BEAM-10564] Support more Avro field name formats when mapping to Java

[lourens] Support for Kafka deserialization API with headers (since Kafka API

[lourens] Assert the deserializer method with a Headers argument exists and

[ryan.worley] Test new mappable field names

[lourens] Introduce a kafkaVersion210Test for testing KafkaIOTest against

[noreply] Fix broken link

[Ahmet Altay] Clarify Beam's use of semantic versioning.

[lourens] Let the kafkaVersion210 configuration use a resolution strategy to force

[noreply] [BEAM-7372][BEAM-9372] Removes Python 2 and Python 3.5 Postcommit jobs.

[Kyle Weaver] Clean up CHANGES.md in preparation for 2.25.0 release.

[noreply] Update indexing skips for pandas 1.x (#12896)

[Robert Bradshaw] Dataframe wordcount example.

[Robert Bradshaw] Update changes file.


------------------------------------------
[...truncated 274.45 KB...]
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 22, 2020 12:45:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 22, 2020 12:45:48 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-kPOJA-6QBJ9D85NLtQFlpn6rnviygF4dUqbpLB8aL8c.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-BwCRWLAz_egrwTaZka7k8Nme9PvjiIe9kTCj2Uz_aMg.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-UuxM10lE11CJV27JpKn69EclBr4hGZPnwOmaoYEoJ2Q.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-fdOWdW7DZaCPFxJkyWneRXjdarnYzOcomkz0xsBPGe4.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-bsUlK2oetWC1esWPIb3WlM6KzaHp0lEHhCuzZZvNX_I.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-JuefbEXL3q8UpKp9Nu1QF87frifPtqFqCXsgOG5gL-0.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-6aUoQk88yBL0hpiGunmQdlHV0X551PMIuqd1C2b0Gw4.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-ePgWWqPGmp9kJm5CPwVNME_ssFkvb5XHnVFL58FKJJU.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-NEJ-pWOU6c9YdB1rKFLpitBppgKZiTyZMn2B-qXviws.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7445096408310920317.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-kfp0vBqKRMQPQjAfwAzRr15s1eVShrB1NTjVjVV62a8.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-nbBjVraRucE8ROUbnZoKebO5MOwcFY1INzG--kKqkjw.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-MqbQUaBhwJtNcJcfLTebqmuVgd4gzG2LVdgMh5d5lBk.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-IFSCCLD4rN0RWNVJzUknfM3Xkaho8Qw-VkNdexgZ9KM.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-VQHepCkUWgr2VngVXDnMHGr0fqZWsKhPkOe9l1yeCIw.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-ddH3pS0Vts3cfdWc3s5LBPWPEv2lePQBe_kEKwC6rHg.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-LPKXZdlxB0KTnTBLJaPMONbpCb30yFjwhE2Va-UjHMs.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-ND_fvMKa__1hmBwW7NWq5zoMK5-zsypMPZev6LmvCp4.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-Id0HqpjNj1Jx_rDd26kUtZAZuKgWpo4iSeNpcG5iZQg.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-klSfoXKX90hK8LY2xsLoR29TqkD4x_u1esUlyGX1L6I.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-taCHLv5ZaKG9qrh_jM7znCuVycUcOkDRlhBDtFfyd6o.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-8HvQPHexjH3a5hWFbHfM1NvUbZ28o3JdsoZNns32Os4.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-4u7vE14XFXxjbe3I2ybdLUveORPuQZGcBkUz8PhMBQY.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-qu4yMABQIaDC2knDF5RxClcW6D4Eh8jPYZ85yvVW_Bg.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-kPOJA-6QBJ9D85NLtQFlpn6rnviygF4dUqbpLB8aL8c.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-eFWQjkfD8JC6I3XThjsIHtVy9XrBp-veEgDArCo9yRM.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-Am2uLOdksTm1KlXfBiNFYl1rLnBK23kt4cmtV7jc9po.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-bwC1BFTYp0jFDd86tdkPCyfu-QbUv7NfZ2zwP6KCNT8.jar
    Sep 22, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-PBGBrWvdyDnrtRgWgPC3VTZQ8OeCnYmbMr1kpZsEgag.jar
    Sep 22, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 192 files cached, 27 files newly uploaded in 0 seconds
    Sep 22, 2020 12:45:53 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 22, 2020 12:45:53 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 22, 2020 12:45:53 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 22, 2020 12:45:53 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 22, 2020 12:45:53 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 22, 2020 12:45:53 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95447 bytes, hash d71e6331a611eb02a54972c539a32cfd04d5741fc041ee3d86cd831185a5dab2> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-1x5jMaYR6wKlSXLFOaMs_QTVdB_AQe49hs2DEYWl2rI.pb
    Sep 22, 2020 12:45:53 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 22, 2020 12:46:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-21_17_45_53-13556326403802827349?project=apache-beam-testing
    Sep 22, 2020 12:46:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-21_17_45_53-13556326403802827349
    Sep 22, 2020 12:46:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-21_17_45_53-13556326403802827349
    Sep 22, 2020 12:46:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-22T00:45:53.626Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 22, 2020 12:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:08.361Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 22, 2020 12:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:09.254Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 22, 2020 12:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:09.293Z: Expanding GroupByKey operations into optimizable parts.
    Sep 22, 2020 12:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:09.334Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 22, 2020 12:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:09.409Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 22, 2020 12:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:09.441Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 22, 2020 12:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:09.479Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 22, 2020 12:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:09.520Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 22, 2020 12:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:09.892Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 22, 2020 12:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:09.972Z: Starting 5 workers in us-central1-a...
    Sep 22, 2020 12:46:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:30.725Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 22, 2020 12:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:32.713Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 22, 2020 12:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:32.752Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 22, 2020 12:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:38.084Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 22, 2020 12:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:55.214Z: Workers have started successfully.
    Sep 22, 2020 12:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:46:55.245Z: Workers have started successfully.
    Sep 22, 2020 12:47:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:47:28.393Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 22, 2020 12:47:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:47:28.556Z: Cleaning up.
    Sep 22, 2020 12:47:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:47:28.610Z: Stopping worker pool...
    Sep 22, 2020 12:48:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:48:20.518Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 22, 2020 12:48:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-22T00:48:20.561Z: Worker pool stopped.
    Sep 22, 2020 12:48:29 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-21_17_45_53-13556326403802827349 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 03704b27-bbbc-4d1d-8938-20b81197a580 and timestamp: 2020-09-22T00:48:29.886000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.572

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 22, 2020 12:48:30 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 52.271 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 12s
107 actionable tasks: 65 executed, 42 from cache

Publishing build scan...
https://gradle.com/s/imyddhr2zig3m

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1023

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1023/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-10931] Remove obsolete ZetaSQL precommit Gradle task.

[Alan Myrvold] [BEAM-9136] Add python dependency license CSV for license URL and type

[noreply] Merge pull request #12721 from [BEAM-10871] Add deidentify for FhirIO

[noreply] [BEAM-9154] Disable Chicago Taxi Example on Jenkins (#12886)


------------------------------------------
[...truncated 272.37 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 21, 2020 6:45:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 21, 2020 6:45:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2020 6:45:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 21, 2020 6:45:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 21, 2020 6:45:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2020 6:45:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 21, 2020 6:45:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
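
The coder failure above is Beam's generic error for a PCollection<Row> that never had a schema attached, and the message itself names the remedy: PCollection.setRowSchema. The following is a minimal, hedged sketch of what that looks like in user pipeline code; the class name, schema fields, and sample values are illustrative only (they merely mirror the columns projected by the failing query: author, type, title, score) and are not taken from this build.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Illustrative schema; field names mirror the query's projected columns.
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

        Row row = Row.withSchema(schema)
            .addValues("someone", "story", "a title", 3L)
            .build();

        // Attaching the schema up front is what keeps coder inference from
        // failing with the IllegalStateException shown above. For a
        // PCollection<Row> produced by some other transform, the equivalent
        // call is rows.setRowSchema(schema), which is effectively the same as
        // setCoder(RowCoder.of(schema)).
        PCollection<Row> rows = p.apply(Create.of(row).withRowSchema(schema));

        p.run().waitUntilFinish();
      }
    }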

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 21, 2020 6:45:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2020 6:45:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2020 6:45:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 21, 2020 6:45:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2020 6:45:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2020 6:45:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 21, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 21, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
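
    For context, the push-down recorded in the BEAMPlan and in the "Pushing down the following filter" line above corresponds, at the IO level, to a BigQuery Storage API read that requests only the used fields and applies the predicate as a row restriction. A hedged sketch of the equivalent hand-written read follows; the table reference is an assumption for illustration and is not the table used by this test.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // DIRECT_READ selects the Storage Read API; withSelectedFields and
        // withRowRestriction express the projection and filter push-down.
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full")  // assumed table, for illustration
                    .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction(
                        "(`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        p.run().waitUntilFinish();
      }
    }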
    Sep 21, 2020 6:45:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-bkkesV_hyu8LYeXaVwKgWxSkbL-CG8XrBczCoPiW2FQ.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-bSeayJLWxBw_cu5S_fxumPwbUOvSoDzMCUF4daVME6o.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3183716132999712181.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-RQmS9mDRP9rX0gPXLOgEB8_odFP9Vgos2TnDs5-WU9M.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-QoJ0CRh46Heyb5FNMYLPy7i0ajscBQS7u0YSiffD8iM.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-o_UEZLiKNFQT3f1oMHKSg2XRQSyIgk5QBHFG_bw696w.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-bkkesV_hyu8LYeXaVwKgWxSkbL-CG8XrBczCoPiW2FQ.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-kuENJLzkXAD1wmqb62oaT5MRgDVZq9GgkKAbT77-8WQ.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-1rzuNmzPACJGr_U2CEL7ivj0yp580JM_ym4uP_EX6LA.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-DpnjcgPtd0jAJKYrO7qiprVC3c4CkV0FKUPrGJN3_sU.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-gVYYBaI7BffboAhcKm-mp-IPXhgACgwJ8KBRDqBdpjg.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-8twWdu5UiILrH8o1NmJ1o01TH3m9f34GiRRhgJuMw9I.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-i9k6RAhRvwXP8suSvj3L4-W1w-HBwPXtp1tTlnywAjI.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-c95YYf44KPYn8wrvLmwrviZB7PwY_TtFLN7YwEAvVEI.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-JxVIPkLwIQCXMNFhQGmNzpEh_JA_lfep_ttTVY4bBqM.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-HEk_TNdp5P4UKkB3MCn5cS3Wvjah0jJ7VejLFvGCNnY.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-NA8cvWFs6oPiHwTKJ2SzPcb0miRQ1iYugZovCAeYlI0.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-G_KQJ8SCXxS-NAfPfJ4rFRgwrEzLYsiJo4bIVYc3UvY.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-K-Qwng6DCyyVrJD6EwRM3PTnXEug9Bhvgcl44sBtnvo.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-GBqu-jxHLLW7hIhNzufuwgxaaFZK67TubDKBElDYNZU.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-RRdynZRXQakLI2t_RYObRZ-irr-CT0_kBYDmacHP8Yg.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-Eym5FiSbRn58Qf2Aj92pN7PuwbV4s1UqgKYszqhNRas.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-oTYJ-GNCrkLDG9Aj_quFiW8eDiVTqPlAdK6FKYUAHnI.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-jMn6re_yXJ0Pl9ijzkGP_AsKTjZzzTkrr2eEo071Xrc.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-OV3taEl43NSFk5yE_rXeDmzXJERZGMoWA8RCduUgbAc.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-uRc-5em724zdllpmDcpH04X8mBBh9efO52ChbUXSpmc.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-YlejEEKwpJky8qA1W1SnjXJagvGHeYXUgpgmYAXigwQ.jar
    Sep 21, 2020 6:45:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-wGhqmd7CyFCl7W09g4ntP6M3VwwmblGRjUOtDXBvDTA.jar
    Sep 21, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 0 seconds
    Sep 21, 2020 6:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 21, 2020 6:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 21, 2020 6:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 21, 2020 6:45:44 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 21, 2020 6:45:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 21, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash cbb2688456fd0f2bce71368671012480a982acae83516611b33fcf560c59ed2a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-y7JohFb9DyvOcTaGcQEkgKmCrK6DUWYRsz_PVgxZ7So.pb
    Sep 21, 2020 6:45:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 21, 2020 6:45:46 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-21_11_45_44-6589137125573011016?project=apache-beam-testing
    Sep 21, 2020 6:45:46 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-21_11_45_44-6589137125573011016
    Sep 21, 2020 6:45:46 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-21_11_45_44-6589137125573011016
    Sep 21, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-21T18:45:44.937Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 21, 2020 6:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:46:01.195Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 21, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:46:04.758Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 21, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:46:04.825Z: Expanding GroupByKey operations into optimizable parts.
    Sep 21, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:46:04.861Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 21, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:46:04.977Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 21, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:46:05.031Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 21, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:46:05.069Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 21, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:46:05.105Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 21, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:46:05.642Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 21, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:46:05.748Z: Starting 5 workers in us-central1-b...
    Sep 21, 2020 6:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:46:28.507Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 21, 2020 6:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:46:36.005Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 21, 2020 6:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:46:53.785Z: Workers have started successfully.
    Sep 21, 2020 6:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:46:53.819Z: Workers have started successfully.
    Sep 21, 2020 6:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:47:29.614Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 21, 2020 6:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:47:29.952Z: Cleaning up.
    Sep 21, 2020 6:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:47:30.119Z: Stopping worker pool...
    Sep 21, 2020 6:48:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:48:24.578Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 21, 2020 6:48:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T18:48:24.673Z: Worker pool stopped.
    Sep 21, 2020 6:48:32 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-21_11_45_44-6589137125573011016 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 7db6a064-25fa-4af9-b9ff-1a1443e14bd3 and timestamp: 2020-09-21T18:48:32.866000000Z:
                     Metric:                    Value:
                   read_time                    18.514
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 21, 2020 6:48:33 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 3 mins 2.846 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 16s
107 actionable tasks: 64 executed, 43 from cache

Publishing build scan...
https://gradle.com/s/uxep4jr7ieja2

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1022

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1022/display/redirect>

Changes:


------------------------------------------
[...truncated 270.06 KB...]
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 21, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 21, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-nIbf0K11fob-oDCqBbNssOu1qyfTkWsjJrSFze-C5mg.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-_eBV5rgLLYDIi-W2P4nXVdkeMMrBHVf1PcC5X59UOKk.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-DOTV-Q2KKRXvQWUzjNOOau0LcPO7G_yMStXdSQgegPs.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-97tPqit7xvPmHjZjWLhS2tWJlIz18RK73yqV97l391k.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-VnbLbE2bRDYHAGmmRHge-0NOJkiGeDIu2zCOh9xN_H0.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-Cg_zHlIOJSOYdOpHgAyt24M6aQ6VLQpvjV8ERWd2MIs.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-ByF4p5X2ppRHEBQuVS3ocGVP55Y0HWmRK8okQ8DSOAA.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-9wnRt0kI5SNUncqIcZfedAPGxV2dRMlu8ZWH1_hu8Rs.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-KGtpx13mGKZwSTfE6XiodK3CFED3Amgd08XQd_c0vQc.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-5h6MJjGJmjIEZ90tMTH6y2sVHjd9_cjJLK19h_vgQ5w.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-_tZbLjk9LPVYNB6WEyGiSIlgmdlflPJN7u2IG4vABvc.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-jk9autvdyJBBUc8MVGY2I1hLZmpu7aBbRRdAkpYTQHg.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-OLlOWlGnq0hvaVAtompUNUPzLS21UXhzdTRDv__Q4IA.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-CnKKXiusrBiXx25FM3xoRnPY7sH6nwpJFPBZieNsMNk.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-GZeMiOA7AQB9dE6sj651exvh2RxHmCtcoeOWRuA1Bqc.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-ETF0M5siNMneFR8FjJcJTdjxncvkzO7wNogyKJoisRc.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-9IY_ii2NmZSt4QoUN5pAJDMtlE4pSye3IrhTBEL47PY.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-rp3tDzWR1JS0vK31gYCPx5pwUpIqe1h4le7LMMpwwx0.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-xoKA_tLutVzQtP1N7IRBRsPg00Bu_AOl8chsPM6Ljew.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1377021143645443993.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-GbRFDd8pM7ntURXlr6H6iC_BrvRFWO-aHzveIQwG8D8.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-xVTavVfLFOomQ3qaLGxZz3W0mRFibfvQIzG6S0belhI.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-wcguua7igx-wkTMXyhsh-i5VGLmcLtgKTbKOoPW91vU.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-nIbf0K11fob-oDCqBbNssOu1qyfTkWsjJrSFze-C5mg.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-t5QsrFk17M5Cf_4eqk33Onso9TNpNJm0UzySrlEK1As.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-N5lKUC7o40_K-2u2AgfVEfYr8nDx9jlJZ2vVxjQ9NEM.jar
    Sep 21, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-Uj0GB5YU8VhBlOBrNzazXOvyYzPV1D-r0LvrXpWezP0.jar
    Sep 21, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-P0kNsIlF1CNArOAKsNuVX-9JLgKtRpnusy1Pk26TTKw.jar
    Sep 21, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 0 seconds
    Sep 21, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 21, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 21, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 21, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 21, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 21, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash d2c06ad998d89a28854c83d08d4d22eaeaa0739b9ca89520b085a870989bef26> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-0sBq2ZjYmiiFTIPQjU0i6uqgc5ucqJUgsIWocJib7yY.pb
    Sep 21, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-21_05_45_25-14451516347427614091?project=apache-beam-testing
    Sep 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-21_05_45_25-14451516347427614091
    Sep 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-21_05_45_25-14451516347427614091
    Sep 21, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-21T12:45:25.248Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 21, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:45:33.157Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 21, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:45:33.908Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 21, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:45:33.947Z: Expanding GroupByKey operations into optimizable parts.
    Sep 21, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:45:33.974Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 21, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:45:34.047Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 21, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:45:34.074Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 21, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:45:34.109Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 21, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:45:34.132Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 21, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:45:34.451Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 21, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:45:34.531Z: Starting 5 workers in us-central1-a...
    Sep 21, 2020 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:45:49.054Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 21, 2020 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:46:08.781Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 21, 2020 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:46:08.817Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 21, 2020 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:46:14.225Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 21, 2020 12:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:46:29.246Z: Workers have started successfully.
    Sep 21, 2020 12:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:46:29.281Z: Workers have started successfully.
    Sep 21, 2020 12:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:47:03.631Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 21, 2020 12:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:47:03.776Z: Cleaning up.
    Sep 21, 2020 12:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:47:03.855Z: Stopping worker pool...
    Sep 21, 2020 12:47:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:47:55.790Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 21, 2020 12:47:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T12:47:55.838Z: Worker pool stopped.
    Sep 21, 2020 12:48:09 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-21_05_45_25-14451516347427614091 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 802438df-ebce-43bc-ac70-f3015c0f3982 and timestamp: 2020-09-21T12:48:09.452000000Z:
                     Metric:                    Value:
                   read_time                    14.016
                 fields_read                 4375276.0
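
Editor's note: figures such as read_time and fields_read are Beam user metrics queried from the finished PipelineResult. A minimal sketch of such a query follows, assuming an arbitrary metric namespace and name; these identifiers are illustrative and not necessarily the ones BigQueryIOPushDownIT registers.

    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricResult;
    import org.apache.beam.sdk.metrics.MetricsFilter;

    public class MetricReader {
      // Sketch: sum a named counter (e.g. a hypothetical "fields_read") across a finished job.
      public static long readCounter(PipelineResult result, String namespace, String name) {
        MetricQueryResults metrics =
            result.metrics().queryMetrics(
                MetricsFilter.builder()
                    .addNameFilter(MetricNameFilter.named(namespace, name))
                    .build());
        long total = 0;
        for (MetricResult<Long> counter : metrics.getCounters()) {
          total += counter.getAttempted();  // attempted values are reported by every runner
        }
        return total;
      }
    }
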

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 21, 2020 12:48:09 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 59.705 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 52s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/pxwr4rz7ymht2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1021

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1021/display/redirect>

Changes:


------------------------------------------
[...truncated 273.20 KB...]
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 21, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
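
Editor's note: the two failing tests hit the standard Beam row-coder error quoted above: a ParDo that outputs Row elements has no schema attached to its output PCollection, so no coder can be inferred. The fix the message itself suggests is to call setRowSchema on that output. A minimal sketch follows, with an illustrative schema rather than the actual HACKER_NEWS schema the IT uses:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowMonitorExample {
      // Illustrative schema; the real one comes from the source table.
      private static final Schema SCHEMA =
          Schema.builder()
              .addStringField("author")
              .addStringField("type")
              .addStringField("title")
              .addInt64Field("score")
              .build();

      // Pass-through monitor ParDo over rows; without setRowSchema the output
      // PCollection<Row> has no coder and fails exactly like the log above.
      public static PCollection<Row> monitor(PCollection<Row> rows) {
        return rows
            .apply("RowMonitor", ParDo.of(new DoFn<Row, Row>() {
              @ProcessElement
              public void processElement(@Element Row row, OutputReceiver<Row> out) {
                out.output(row);  // a real monitor would also record metrics here
              }
            }))
            .setRowSchema(SCHEMA);
      }
    }
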

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 21, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 21, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 21, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 21, 2020 6:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
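
Editor's note: the plan above (BeamPushDownIOSourceRel with usedFields and a supported BigQueryFilter) means the SQL layer has turned the projection and WHERE clause into a BigQuery Storage API read that requests only four fields and evaluates the filter server-side. Below is a minimal sketch of the equivalent hand-written read at the BigQueryIO level; the table name is an illustrative public dataset, and this is not the IT's own code path (which goes through BeamSqlRelUtils):

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    public class PushDownReadExample {
      // Sketch: what the pushed-down plan amounts to when written directly against BigQueryIO.
      public static PCollection<TableRow> read(Pipeline pipeline) {
        return pipeline.apply(
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full")   // illustrative table
                .withMethod(Method.DIRECT_READ)                  // Storage API read
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
      }
    }
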
    Sep 21, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-6fTx9uywdJQrIUF6MxAdHir1lilFtL-6yTgIrPyOMWo.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-_-sNR7aFlZTEIsLmnsqOOxZ6q76g3Pj7ppqzKlKPFj4.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-Om1H7ZUPVgwTj0hrKoi_YqY-S9V2j9s1Be3hm2gE4q0.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-4Bmif_KUfzJtoSs5pmltku_Wn5TH8LhuJXX_hDn9wI8.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6552647750856024480.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-2fWDVH5X_14a46gLhnuyX5Ywl5KFf5n1rw-bHzlemRU.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-refVSHzfbHj0Sliqmkh4U9IdNGhDWhWKpC2F2r3orWA.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-QilmWCc9ssWcY_FYPXlcPb-gPhSodk01MytSXZGpP9M.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-ZAc4M_-pdQ-56acn5R4etDP7G29_CGO9yT4QERDqSKk.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-CrZ8pcKoKxN2eZcyq5w422guuqwM5hZhGSS6QbbtkvY.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-E_9ds6EfR5TscZK-RfDKgQFXTkHpTMEvCPUCLrsuysY.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-6fTx9uywdJQrIUF6MxAdHir1lilFtL-6yTgIrPyOMWo.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-TWr9aumuJhet8FT_xLYGH95hLV8exr6WH-6wgzWVliU.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-HNyU_i2r_pmIc1awLvhCqKsl9a32oVcVfbSYib0njzU.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-pLH3Pwk3JvkXv-tcEIPyV-xTxR-_eTo2lnNjvhDBrkA.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-zRHeN2tPNO53cDzXoifrUNe06TMlqfrpDHZ5XLtbUdM.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-12MrU96Z3-2tZg1j_b3C_jZ6RNXpnJPPv4O34b735QI.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-WlcxWIK-85GmeHirXDiTlkHI_vmgdbp0zC7uv5tRLe0.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-2lho7iOIc0ZXdIFGZKZDBOehBiCi1g3wt1NvA8hcfPI.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-TnxGoQT7ozEpQe-B7qFElaD2C9ddkhg0S6Oo4_eSLdI.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-J5A91O2K3-Int1VmChzgrvUh8QAtt3BVeYpGsFVUaG8.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-DNIGkA-1Oc4aZuP3vqBOBNveEMVebn6-Q2tWC2PwB20.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-L0cP8lst_WfIUyykhjGb5XK-SmtAUeE4Ev18YYnVtpM.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-jjKcQ57rYp2t8J8GvafKGA8csZLZTWVEILSeO02l4Uk.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-KcB-Py9_4cK4ultbhMTnAp7bDuaB6Z_x70-f1IjkGas.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-a5P0JPSDAyU-lPXUMiTHDfdS3IKb6s4iPey3up7rikg.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-3tmnxz7e1ypyOf8pXXNkPi8SmVbi9b84SWXGb4Nhr6c.jar
    Sep 21, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT--6SvoK1lTkUkQq7_teHVCNHXpNxShoJaOtz1DlMmxzc.jar
    Sep 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/de.flapdoodle.embed/de.flapdoodle.embed.mongo/2.2.0/781d14f4e3d9eeb0b4c3e00a4ec165a04b3dc5c4/de.flapdoodle.embed.mongo-2.2.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/de.flapdoodle.embed.mongo-2.2.0-vNy3lJC0jW9u4Cy1AHsqSbjRUqOTX9ycpEmHkht7vvk.jar
    Sep 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.codehaus.janino/janino/3.0.11/e699e368095379ba0402ea1780a87fcaea16e68f/janino-3.0.11.jar to gs://temp-storage-for-perf-tests/loadtests/staging/janino-3.0.11-kje3HSMpGA5ZIQ6aqhAO4xNFTvCuWIYIx1yxkxlZG-E.jar
    Sep 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/de.flapdoodle.embed/de.flapdoodle.embed.process/2.1.2/986b38302fa10018d5baccee7f655c31ee9afe5b/de.flapdoodle.embed.process-2.1.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/de.flapdoodle.embed.process-2.1.2-OasY7D5KRAimcZcWcjFwgi8Qb4B-iff1FfrVeNSih6A.jar
    Sep 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-csv/1.8/37ca9a9aa2d4be2599e55506a6d3170dd7a3df4/commons-csv-1.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/commons-csv-1.8-qL1WZS7UZo2dWjOZSuUvWbnjnI6w68tmhOaK7udXmmE.jar
    Sep 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.codehaus.janino/commons-compiler/3.0.11/f2a6ec7fbc929c9fc87ff8bb486c0574951c5b09/commons-compiler-3.0.11.jar to gs://temp-storage-for-perf-tests/loadtests/staging/commons-compiler-3.0.11-DxpPXyZccBoxkzJErnBF_O8YtPpZUEF-Je5wvlDd2s8.jar
    Sep 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.beam/beam-vendor-calcite-1_20_0/0.1/6d16a59dc771784789116607a04acd9a0ef91d16/beam-vendor-calcite-1_20_0-0.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-calcite-1_20_0-0.1-1NrX_9FNKiEqNk5qBOaRlj-IwqOvKvQIGIbTVgm_v8Y.jar
    Sep 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/antlr4-runtime/4.7/30b13b7efc55b7feea667691509cf59902375001/antlr4-runtime-4.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/antlr4-runtime-4.7-KmGUP4A7vR0OAt_9GbkqQY-DNAyZQ0aAnjtR4iMapsA.jar
    Sep 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/ST4/4.0.8/a1c55e974f8a94d78e2348fa6ff63f4fa1fae64/ST4-4.0.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/ST4-4.0.8-WMqrxAyfdLC1mT_YaOD2SlDAdZCU5qJRqq-tmO38ejs.jar
    Sep 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.mongodb/mongo-java-driver/3.9.1/d313237180bf9f2f82e12f503d9617e6b070f792/mongo-java-driver-3.9.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/mongo-java-driver-3.9.1-mxKxkvmYluxV-Hdn57uyt-MjjSQUsFjxFw9tjhx0bm4.jar
    Sep 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.ibm.icu/icu4j/58.2/db9fd4b4c189cf1518db14c67d14a2cfcfbe59f6/icu4j-58.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/icu4j-58.2-lT4eg7K-fD6i-I2obBNhT0fp5x01eMhSHX8Yd1a2OWI.jar
    Sep 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/antlr-runtime/3.5.2/cd9cd41361c155f3af0f653009dcecb08d8b4afd/antlr-runtime-3.5.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/antlr-runtime-3.5.2-zj_I7LEPOemjzdy7LONQ0nLZzT0LHhjm_nPDuTichzQ.jar
    Sep 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.glassfish/javax.json/1.0.4/3178f73569fd7a1e5ffc464e680f7a8cc784b85a/javax.json-1.0.4.jar to gs://temp-storage-for-perf-tests/loadtests/staging/javax.json-1.0.4-Dh3sQKHt6WWUElHtqWiu7gUsxPUDeLwxbMSOgVm9vrQ.jar
    Sep 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/antlr4/4.7/cd6df46532bccabd8127c18c9ca5ef481962e931/antlr4-4.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/antlr4-4.7-eGclcCizNzrwEd7nts6bWHqP1cegsl9osv9MuQvoqgc.jar
    Sep 21, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.abego.treelayout/org.abego.treelayout.core/1.0.3/457216e8e6578099ae63667bb1e4439235892028/org.abego.treelayout.core-1.0.3.jar to gs://temp-storage-for-perf-tests/loadtests/staging/org.abego.treelayout.core-1.0.3--l4xOVw5wufUasoPgfcgYJMWB7L6Qb02A46yy2-5MyY.jar
    Sep 21, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 179 files cached, 40 files newly uploaded in 1 seconds
    Sep 21, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 21, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 21, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 21, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 21, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 21, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash f825ee5e5fff8e1a6ac38bbcfcf07b276114f4d108f77b3fbc79031c66a4ce1b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline--CXuXl__jhpqw4u8_PB7J2EU9NEI93s_vHkDHGakzhs.pb
    Sep 21, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 21, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-20_23_45_20-13725072535501620475?project=apache-beam-testing
    Sep 21, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-20_23_45_20-13725072535501620475
    Sep 21, 2020 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-20_23_45_20-13725072535501620475
    Sep 21, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-21T06:45:20.769Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 21, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:45:29.445Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 21, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:45:30.074Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 21, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:45:30.111Z: Expanding GroupByKey operations into optimizable parts.
    Sep 21, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:45:30.165Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 21, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:45:30.234Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 21, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:45:30.265Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 21, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:45:30.295Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 21, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:45:30.328Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 21, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:45:30.753Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 21, 2020 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:45:30.821Z: Starting 5 workers in us-central1-a...
    Sep 21, 2020 6:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:45:54.361Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 21, 2020 6:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:45:55.755Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 21, 2020 6:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:45:55.788Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 21, 2020 6:46:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:46:01.036Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 21, 2020 6:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:46:14.942Z: Workers have started successfully.
    Sep 21, 2020 6:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:46:14.973Z: Workers have started successfully.
    Sep 21, 2020 6:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:46:50.708Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 21, 2020 6:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:46:50.845Z: Cleaning up.
    Sep 21, 2020 6:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:46:50.941Z: Stopping worker pool...
    Sep 21, 2020 6:47:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:47:35.421Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 21, 2020 6:47:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T06:47:35.472Z: Worker pool stopped.
    Sep 21, 2020 6:47:42 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-20_23_45_20-13725072535501620475 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c9db7fa5-2ea8-4517-be9c-3aac3587a738 and timestamp: 2020-09-21T06:47:42.947000000Z:
                     Metric:                    Value:
                   read_time                    18.151
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 21, 2020 6:47:43 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 36.347 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 26s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/4hbqq4ub4sa4s

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1020

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1020/display/redirect>

Changes:


------------------------------------------
[...truncated 269.77 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 21, 2020 12:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 21, 2020 12:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 21, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 21, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 21, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 21, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 21, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 21, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 21, 2020 12:45:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 21, 2020 12:45:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-a7tA5tBS8nZBQI45KsDmhne3vv8A4NCpmsuOZO1Bi4o.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-5mdUCx09gqjJtt3yKez6ViUWOIbYIeN8eJHXZECS95c.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-BH9hdVbnpJPb2JKSvlDCvIeBDAJfuECyFyU4Li6LjD8.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-EncFyGUZMIiWeZYHtfqlYxDC8fw4R34SiiyWp6HJLsc.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-qZq_30SZrL74Gdhb3cY5U8_ruWzCwczqnKYBcxMwcY8.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-0wi3HjsVbdqjjUp0HtCOrg_YkpwgX4S8cnqhMg-F9tw.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-e0QYuhzMplEOBImrq-dWwMf6v3CkktjnnEq6Wqnzv1U.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-n9bgR8SutXvTQZXjUVSq6oFM8bSJdD1jFcW4LFI5IW8.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-RFQQ8saCErSZYXVXcH8LKyHFEP5JEHhIlP1fbILBbcA.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-z8k0l-dROB984Kwl9ovRgFDKfMdGUBsmK7nrOApzOsA.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-WmkeN0RfUopmhBWSQuwYEALH1JqaqFfMEEijmFkfWaU.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-AypiAwdb7EQQXTtXLVcs28f0fc_5QUhp2_4qKITzC5c.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-S-905rNqei-x4tppXM6eRlymeRefUyMkQRjKPwDh8tc.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-3Zx5o9dpS-gA4jbQEqbnQQMw-y880-TYMW_SwkFV-ZE.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-erJmYA0BAjITk6NyitwJ-OY-iAdrt6ZGdjzE-vuVLJg.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-_U08HB7iF0ityRC0zJRVyCv9NjGqZEOucMs4ATU81XE.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-59DyPGl6YUU3O9mHFAEqOAOo_6EuyKm058QwirY7XkE.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-Vav9AJIl5uFMHVbLhsYQcZAWlCYGCoVIeNsFriHMoCc.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-5aeyU6g0y5HRZrfeilWgDADHlVVu1TTdPXAjvDO0NJQ.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-AypiAwdb7EQQXTtXLVcs28f0fc_5QUhp2_4qKITzC5c.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-HxlwmndQhB_MOzAZoKeVHKNe0wO9AbrCMQsdrRii2aw.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-246bVRi_NecX7eOW2ruf-AwU0ixnjo6G_vslN4AhI3Q.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-wZ-bV7GD6ylN0eXewf9tyKyxyToUgNigNtBPK5n8P-Q.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8682017055224343196.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-IvcUvEP_mYsvVkJa2iYsMEycppTufZ94StSpYKjgvjg.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-IjKIBrKV4TsOaGSiY1c-56gOutcWEmpKAitk4iZ8JPY.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-0mTzrBtSI1TxW9FejwMze0zw5AxBucoWgsJth_OHQ5Q.jar
    Sep 21, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-e_gIaGGjPxBofLg9f_nU9j7MJXzFs8_ZmipQes0tjzM.jar
    Sep 21, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 0 seconds
    Sep 21, 2020 12:45:28 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 21, 2020 12:45:28 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 21, 2020 12:45:28 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 21, 2020 12:45:28 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 21, 2020 12:45:28 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 21, 2020 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash 8f1785b7f3606c256e3951360cf31a8987c140fedf4ba8c12b4474f002aabc9d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-jxeFt_NgbCVuOVE2DPMaiYfBQP7fS6jBK0R08AKqvJ0.pb
    Sep 21, 2020 12:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 21, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-20_17_45_29-3141953906935979055?project=apache-beam-testing
    Sep 21, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-20_17_45_29-3141953906935979055
    Sep 21, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-20_17_45_29-3141953906935979055
    Sep 21, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-21T00:45:29.091Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 21, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:45:38.685Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 21, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:45:39.191Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 21, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:45:39.231Z: Expanding GroupByKey operations into optimizable parts.
    Sep 21, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:45:39.259Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 21, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:45:39.341Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 21, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:45:39.370Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 21, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:45:39.405Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 21, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:45:39.440Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 21, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:45:39.814Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 21, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:45:39.871Z: Starting 5 workers in us-central1-a...
    Sep 21, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:45:59.127Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 21, 2020 12:46:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:46:04.214Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 21, 2020 12:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:46:28.085Z: Workers have started successfully.
    Sep 21, 2020 12:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:46:28.120Z: Workers have started successfully.
    Sep 21, 2020 12:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:47:01.419Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 21, 2020 12:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:47:01.561Z: Cleaning up.
    Sep 21, 2020 12:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:47:01.653Z: Stopping worker pool...
    Sep 21, 2020 12:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:47:50.301Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 21, 2020 12:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-21T00:47:50.343Z: Worker pool stopped.
    Sep 21, 2020 12:47:58 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-20_17_45_29-3141953906935979055 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 1e3c5d81-b8a2-4dba-ba1c-f82aee33a059 and timestamp: 2020-09-21T00:47:58.520000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.552

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 21, 2020 12:47:59 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 44.73 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 41s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/lft327mz76b7u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1019

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1019/display/redirect>

Changes:


------------------------------------------
[...truncated 271.00 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 20, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 20, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 20, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 20, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 20, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
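
Both read-method failures above fail for the same reason: the PCollection of Beam Rows coming out of ParDo(RowMonitor) ends up with no coder, and the IllegalStateException itself names the two remedies (PCollection.setRowSchema or an explicit setCoder). Below is a minimal sketch of that generic fix; the field names come from the SELECT list in the logged query, but the exact types, nullability, and the surrounding class/method names are illustrative assumptions, not the change actually made to BigQueryIOPushDownIT.

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaFix {
      // Schema for the four projected columns; the types are an assumption
      // (score is an integer column in the public Hacker News data).
      static final Schema ROW_SCHEMA =
          Schema.builder()
              .addNullableField("author", Schema.FieldType.STRING)
              .addNullableField("type", Schema.FieldType.STRING)
              .addNullableField("title", Schema.FieldType.STRING)
              .addNullableField("score", Schema.FieldType.INT64)
              .build();

      // Attach the schema so a row coder can be used for the Row output,
      // as the error message suggests via PCollection.setRowSchema(...).
      static PCollection<Row> withRowSchema(PCollection<Row> rows) {
        return rows.setRowSchema(ROW_SCHEMA);
        // Equivalent alternative: rows.setCoder(RowCoder.of(ROW_SCHEMA));
      }
    }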

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 20, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2020 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 20, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 20, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 20, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 20, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT--n8sI4CMsMUf6A9n8cMkp2ASrwwodjqtZoSaJbbAK10.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-1KuXi7U_u5coqigk9z9FUXDZgS7hH1xJ2NuKhF4He4Y.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-BCcgjZTYD9f_9NEg-Zhv1PzmzkCOn6CzGrgzjOuYkOI.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-JHov32yIzKSa0DLSuq4wSaXTckolp4YAEnql3IIaMkw.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-HwE0-0LAcY-PLKH2mCvl1M_SgU6ENnCL4K7LTTmDiHQ.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-WETmD6qwYNlFLm7rWknfML2KE0K9HUN3BXITg2yR0ow.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-kEEwbyak9Td2ergVHmtCEyi5vqM-ddjKqyg080CzJrs.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-K24HGqBq5KPIdGcLZpfL2l2-G-C2pAubTw5D4P9aKK8.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-AtkBWglXcMQJEKQvrk9TnotRZUvdjfmSOSM8xUO7xcU.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-XzBoiTaV3rnL-3pawLpf2V6OrNhVCUMPj7dwFDSLFeY.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-IrygagselvZf3LMgJgxCgNfVrE2ExqsDmW86Zgs0Yhw.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-vbGm4o1gQNwEqaaiZXVSImAgKigSVy-M7vx8mjfL8_s.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests--mjqPx2plUbnKSXemMZesM3LYw6t762r97eZW4lV_Xs.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-BVWIgdN7MHASeTh4nE1LTF-cBflGMHTYOdSvLbU_EJ0.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-dKgdqV3x1N1VLVJlRhin53Zl1Dt49YxaoxcSwwTTRQA.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-AaXwt1vL7SC5Y2zqaE5fLjwo-n1kxc0pJNenKxpL94Y.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-TUBbndA4C0MdUPqCKoKQ8TZ9xuo6opQnaM4Aw4zaVtM.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-oX5zE61xdpy71by5MrJN9T2gQQ93kJjMzHPDzIOTx-s.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-4a9_ITi6g87KQSxM7O0_o1QLe1j0GkAcgskiY9-21gc.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-_GPg_nrtCjkGgLSBgS8gyE5rCE77-txCoB3Z0iRgJ_k.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT--n8sI4CMsMUf6A9n8cMkp2ASrwwodjqtZoSaJbbAK10.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-OZ5yZJsiCN4uGtfbQmUVTCbLiFHa700aqyyR-LRkEgI.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-RBBXkpjZMu_XpMNe6-o4vgjpQ2RhwnywrbW2_RKmz9w.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-de2PBm7UkXuayuBEEzhcQVquuiXAXYdYUSyF5ZT2qsI.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test564696704809297892.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-TQ7z5CMWP7TR6p6QLSC8wI2BZfBNhrllOFoAhz6N3v8.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-tpwccbWLmPBtGSuJSA8i7Fw4GasiscB23RnYvnv-IkM.jar
    Sep 20, 2020 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-C0Kfl-CxBfU207cPIXmwn8SspOYuYY9_bNvoh9n3vDg.jar
    Sep 20, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 0 seconds
    Sep 20, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 20, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 20, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 20, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 20, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 20, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95444 bytes, hash 6803426bb3092946937c27c4b5c99ff4dff73ebd938c7ffd3915876f394f0585> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-aANCa7MJKUaTfCfEtcmf9N_3Pr2TjH_9ORWHbzlPBYU.pb
    Sep 20, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 20, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-20_11_45_22-2079414324754370787?project=apache-beam-testing
    Sep 20, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-20_11_45_22-2079414324754370787
    Sep 20, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-20_11_45_22-2079414324754370787
    Sep 20, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-20T18:45:22.811Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 20, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:45:30.667Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 20, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:45:31.714Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 20, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:45:31.776Z: Expanding GroupByKey operations into optimizable parts.
    Sep 20, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:45:31.827Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 20, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:45:31.916Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 20, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:45:31.966Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 20, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:45:32.015Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 20, 2020 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:45:32.082Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 20, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:45:32.596Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 20, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:45:32.713Z: Starting 5 workers in us-central1-b...
    Sep 20, 2020 6:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:46:01.007Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 20, 2020 6:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:46:03.851Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 20, 2020 6:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:46:23.007Z: Workers have started successfully.
    Sep 20, 2020 6:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:46:23.065Z: Workers have started successfully.
    Sep 20, 2020 6:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:46:57.813Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 20, 2020 6:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:46:57.956Z: Cleaning up.
    Sep 20, 2020 6:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:46:58.029Z: Stopping worker pool...
    Sep 20, 2020 6:47:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:47:50.336Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 20, 2020 6:47:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T18:47:50.373Z: Worker pool stopped.
    Sep 20, 2020 6:48:00 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-20_11_45_22-2079414324754370787 finished with status DONE.
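
For the push-down run that completed above, the table is read with the DIRECT_READ method and the filter (`type` = 'story' OR `type` = 'job') AND `score` > 2 is pushed into the BigQuery Storage read, as logged by BigQueryTable.buildIOReader. As a rough illustration of how such a table is typically registered with Beam SQL (not the registration used by this integration test), the read method can be selected through TBLPROPERTIES; the column list and LOCATION below are placeholders:

    CREATE EXTERNAL TABLE HACKER_NEWS (
      title VARCHAR,
      `by` VARCHAR,
      score INTEGER,
      `type` VARCHAR
    )
    TYPE bigquery
    LOCATION 'my-project:my_dataset.hacker_news'
    TBLPROPERTIES '{"method": "DIRECT_READ"}'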

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 73cdebb0-7a5b-4a7e-942d-33e50a300130 and timestamp: 2020-09-20T18:48:00.461000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.252

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 20, 2020 6:48:01 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 51.875 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 44s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/uopigm5rgwqmi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1018

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1018/display/redirect>

Changes:


------------------------------------------
[...truncated 269.69 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 20, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 20, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 20, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 20, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 20, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 20, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 20, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 20, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 20, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 20, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 20, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-oCLrbH5nRwgRbrTtg--V-Y6I-QarhPhUCuuyL4O8cas.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-c-p10mpnzzPSRfVgRPigZvnDvJIwiMRb8YrX545U4-k.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-H0D4FDVjFNz4k40zRqu9z8zGBozfeKVsq9gmxvd9bb0.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-6BhFjsq4mRiuRgWtUzOV2jyU83eU-A6ZJJcF6TSO-AU.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-pb_BdlWbxsa7I1SZyANaEwNZrtifNnZ4PLBCcPO86Sc.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8664879349511158062.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Xf9JW3JzuEC_7vqMytNO0KaHJGKM2vft_JdfNU4w9Do.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-4sN-Cr0XChZPG52O-eTrmFKIe-B6umLQwM6r5U08CCQ.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-fAClgjNIPA_9ftloUxDNH1nrqtD97iHLAr_QwngPqsI.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-aIBDItcYCdWvXXJVKgthkobWvpQEY9Pwxier3ndGUyI.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-Uqx5iGjlCf_HFo5pLS9KDIhCzlp6rj6PvhT-D7AiQf0.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-eGKspAM15TpOB434ZO52Q4mUoJ-kICezunG_3qDA9VY.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-cdD6DzWKBe3OFfhr-Cbi07h10XaGfmt15I7gpe9JEFY.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-eG4uKHFseINSjz-7elfwAdWSyqwLQ6Q_Ge3mXYEB1-c.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-0RTjyiAJ-jJ-a1-grAowxf4ALA5wkWJ9IwTu3EW2ZF0.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-rkp96itfJKuE9Ejj4N0DsyyXpkhHvSyqwt3bGi97HUQ.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-4RpPZW9nUvmI5UI-Z9gJt8xKyW1wfbZgSvj99bUok8o.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-k0623Ztsw68W2FlNjByjDmnKus14n-trryffhJSIAoo.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-ZOeerKX0xcciiZPXqpb3HLBp77OmWK8vi6HElTw7Tps.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-oCLrbH5nRwgRbrTtg--V-Y6I-QarhPhUCuuyL4O8cas.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-xL8Pn7aGCGeF3C6ARM42GQPr3yCwBzRH_9zWJ2nu-PI.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-eTJT2SXCW0VF7Gy0lSGujhaC9He0YLeUxpa07QuMDHE.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-mGud3fCKhgK5U6tF6cFClm2im3wFm1WwljZTd7NRbjU.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-d_5-Fhq_UZbC713QgjCH3hYa76RSv0WGbZLYi8irhNg.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-K_WEgUqE9dyMZ4JqRErO-Y7G-gjtlHhfV9BlQKxwed4.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-w_03CJhuMQyr7eQi7hip72Ha1TT9w2vaCmRzE1h64Eo.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-d9-KI235l5EKOpt0LQgdmqwgW-pyZBANpDKnPV9Lxco.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-YgHvC7-qgFl7OOusM2G3DDND0NqRcPztFUYMBBj4ZAk.jar
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 0 seconds
    Sep 20, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 20, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 20, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 20, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 20, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 20, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95448 bytes, hash 2974ecb3875e9e660e86f0b18c7b1a1e5ef7091a2fd02b89258867704f15089f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-KXTss4denmYOhvCxjHsaHl73CRov0CuJJYhncE8VCJ8.pb
    Sep 20, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 20, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-20_05_45_25-9017304659873718613?project=apache-beam-testing
    Sep 20, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-20_05_45_25-9017304659873718613
    Sep 20, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-20_05_45_25-9017304659873718613
    Sep 20, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-20T12:45:25.541Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 20, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:45:32.679Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 20, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:45:33.350Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 20, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:45:33.387Z: Expanding GroupByKey operations into optimizable parts.
    Sep 20, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:45:33.424Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 20, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:45:33.505Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 20, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:45:33.538Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 20, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:45:33.575Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 20, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:45:33.599Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 20, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:45:33.911Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 20, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:45:33.993Z: Starting 5 workers in us-central1-a...
    Sep 20, 2020 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:45:47.014Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
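
The warning above already names the relevant Monitoring API methods (metricDescriptors.list and metricDescriptors.delete). As a rough illustration of the clean-up it suggests, the sketch below deletes one custom descriptor with the google-cloud-monitoring Java client; the metric name is hypothetical and would be replaced with a genuinely unused descriptor returned by the list call, run under credentials that can administer apache-beam-testing:

    import com.google.cloud.monitoring.v3.MetricServiceClient;

    public class DeleteUnusedMetricDescriptor {
      public static void main(String[] args) throws Exception {
        // Hypothetical descriptor name -- substitute an old, unused
        // custom.googleapis.com/* metric found via metricDescriptors.list.
        String name = "projects/apache-beam-testing/metricDescriptors/"
            + "custom.googleapis.com/old_unused_counter";
        try (MetricServiceClient client = MetricServiceClient.create()) {
          // Removing unused descriptors frees capacity counted against the
          // custom-metric limit mentioned in the warning above.
          client.deleteMetricDescriptor(name);
        }
      }
    }
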
    Sep 20, 2020 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:45:58.302Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 20, 2020 12:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:46:18.636Z: Workers have started successfully.
    Sep 20, 2020 12:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:46:18.673Z: Workers have started successfully.
    Sep 20, 2020 12:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:46:53.348Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 20, 2020 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:46:53.527Z: Cleaning up.
    Sep 20, 2020 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:46:53.627Z: Stopping worker pool...
    Sep 20, 2020 12:47:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:47:46.339Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 20, 2020 12:47:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T12:47:46.385Z: Worker pool stopped.
    Sep 20, 2020 12:47:55 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-20_05_45_25-9017304659873718613 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 81589ba3-aa2b-4174-bb78-3d9eb97d05db and timestamp: 2020-09-20T12:47:55.822000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    17.469

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 20, 2020 12:47:56 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 2 mins 45.58 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 40s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/yxtsclnm6nayy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1017

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1017/display/redirect>

Changes:


------------------------------------------
[...truncated 271.50 KB...]
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 20, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 20, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 20, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 20, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 6:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 20, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
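
The IllegalStateException above lists its own remedies: set a Coder explicitly with .setCoder(), or declare the row schema with PCollection.setRowSchema. A minimal, self-contained sketch of the setRowSchema fix follows; the class name, sample rows, and field values are illustrative only and are not taken from BigQueryIOPushDownIT (assumes the Beam Java SDK and a runner such as the DirectRunner on the classpath):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.transforms.SimpleFunction;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Schema for the four columns projected by the query in the log.
        final Schema schema = Schema.builder()
            .addNullableField("author", Schema.FieldType.STRING)
            .addNullableField("type", Schema.FieldType.STRING)
            .addNullableField("title", Schema.FieldType.STRING)
            .addNullableField("score", Schema.FieldType.INT32)
            .build();

        PCollection<Row> rows = p
            .apply(Create.of("alice,story,Example title,3", "bob,job,Another title,5"))
            .apply(MapElements.via(new SimpleFunction<String, Row>() {
              @Override
              public Row apply(String line) {
                String[] f = line.split(",");
                return Row.withSchema(schema)
                    .addValues(f[0], f[1], f[2], Integer.parseInt(f[3]))
                    .build();
              }
            }))
            // Without this call, coder inference fails with the same
            // "Cannot provide a coder for a Beam Row" error seen above.
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }

In the failing test the analogous change would be attaching the row schema (or an explicit coder) to the ParDo(RowMonitor) output before any downstream transform consumes it.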

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 20, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 20, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 20, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 20, 2020 6:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 20, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-_nVyYi0r6LvvRIm_HbCASJVHmmVYXUSrxLMlXRQNRFY.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-dTlc_ZCSlaOfM5D9Uyn6-V6ko0nX9SVES3ttVg7ecrE.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT--LKb0kd-LTCHpNo2hlmJNGigLPcc1PJgtkg4rgeeass.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-5iSKK1gis2CSiGGbt0cRn4F32yiqd6HtUWnn7cB5BWM.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-L0Hg5Ck6vqugvY_EmRKR--jn8ZY1sv5y26QSfB20rWE.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-xcNdKVNkZKLT2cd5dwRQ6OmByOAJ2MXh-NZszVTITxM.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-EETnHVVu-qqwV84hwvbphUr8G8u78_HKPkzSoV_4IHI.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-JLd2AbkIS5t4rhSpra02dTVcLdwt8W-wB21gSa3JqxU.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-uOBQgzyjTVHejIslFOPyzEeirni5YO7D205rd4wrAdc.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-c6ATq0zlunG2DSA64wWwR1Zldl0ZemZr8BRSZdGzdSs.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test649361941710624551.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-yyukRWmkrbt-Whpzlry0AkXBPo978sdZHGA6dqAef_8.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-Jl31dYqWInTyYEa1gVe_gSQTtzF7nb4_P3ipJiFeRt4.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-naiOgCVdtEYBRX4z7RkG6QVCDApDsGc0EDEBaDzk2vw.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-JnmlzYXXYVvx0iQYX08oFctBaHnM-kHZCcaPdwpqQkA.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-kl3HhcIbzjLnrvuYpK5lnaESIBB8ngOw3kCJCZjktoo.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-GMzxYehMw9r1NDYjrz41JjiRrYyXgKYBcVNSxggN9UU.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-xcNdKVNkZKLT2cd5dwRQ6OmByOAJ2MXh-NZszVTITxM.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-DZO7qaztLcCeCUZ7oDld60ACvFwsLzhaAOJWJ5WMQnA.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-8Cswar7DMgTwFWX1o4Y-7VccsT9MNHezJ8HhTGAOGqI.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-I19VMW88TiwFKwK4otm2IfzZ2a8DEj4LguIkfFAN8QI.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-QM_swhBkpsu_wLDDwcvpJsmZ_1Kcoyfo6HVgMknc7kU.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-fpuZ35nYf_BDLD0TwuSqKnI9BhjMi_Z08GXToph6sJM.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-bUhkxlSwsWTK7HWwszjZ0bkZDQgUPj8iC9ULTY0iloQ.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-ytdcoqJq3zk-zaao76UnfHS2Dcp09b1ZX4tR9YNZL_Y.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-rNng08-V7oAOeWUkdAQBWQjtvroR8U2W-nqrrfhg-18.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-SCo4lMaDGtLJjMlsjpg31DholgROkwdO0U2nRg3eGDk.jar
    Sep 20, 2020 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-8K_H-LorEvR0OEAJsKe8Qtx6b_qYtpEoYYTkM60SSSo.jar
    Sep 20, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 1 seconds
    Sep 20, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 20, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 20, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 20, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 20, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 20, 2020 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95444 bytes, hash d07e77d70f4c9d51ddc017e0b215f4629b447e8d770788d5c33c8d6ead3c7e79> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-0H531w9MnVHdwBfgshX0YptEfo13B4jVwzyNbq08fnk.pb
    Sep 20, 2020 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 20, 2020 6:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-19_23_45_22-9055895011856506504?project=apache-beam-testing
    Sep 20, 2020 6:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-19_23_45_22-9055895011856506504
    Sep 20, 2020 6:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-19_23_45_22-9055895011856506504
    Sep 20, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-20T06:45:22.689Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 20, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:45:29.671Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 20, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:45:30.378Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 20, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:45:30.414Z: Expanding GroupByKey operations into optimizable parts.
    Sep 20, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:45:30.440Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 20, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:45:30.531Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 20, 2020 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:45:30.561Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 20, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:45:30.587Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 20, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:45:30.615Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 20, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:45:31.083Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 20, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:45:31.154Z: Starting 5 workers in us-central1-a...
    Sep 20, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:45:54.290Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 20, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:45:54.328Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 20, 2020 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:45:59.627Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 20, 2020 6:46:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:46:03.293Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 20, 2020 6:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:46:17.777Z: Workers have started successfully.
    Sep 20, 2020 6:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:46:17.810Z: Workers have started successfully.
    Sep 20, 2020 6:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:46:58.161Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 20, 2020 6:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:46:58.313Z: Cleaning up.
    Sep 20, 2020 6:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:46:58.387Z: Stopping worker pool...
    Sep 20, 2020 6:47:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:47:43.235Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 20, 2020 6:47:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T06:47:43.268Z: Worker pool stopped.
    Sep 20, 2020 6:47:51 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-19_23_45_22-9055895011856506504 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 7f887d48-ea8e-4976-9a55-3aaf16a1a216 and timestamp: 2020-09-20T06:47:51.381000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    20.588

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 20, 2020 6:47:51 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 42.924 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 34s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/euxdobtjouy7g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1016

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1016/display/redirect?page=changes>

Changes:

[noreply] [BEAM-9547] Add indexing tests (#12856)


------------------------------------------
[...truncated 269.99 KB...]
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 20, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 20, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 20, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 20, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 20, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 20, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 20, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2020 12:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 20, 2020 12:45:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 20, 2020 12:45:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 20, 2020 12:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-heHa_wLGFieenNyyBPn0dqVrzczMuMQnTz6bz2v6QcI.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-6ZdgnBTj2R5ld299c5TAZpO_yVNDhwFW25rYtx5Q-_Q.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-dFpA27fzYgAM7wl9c-900_oyisIvESzsJ-aML49v41k.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-CPJh1uxmRi7Vg1KaLIAOyUH5il7rHREZ-6bnClZ6TkA.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-XDFs1N4sJ-OaYqsknoQ_J5E9Amo8anHQ-AJNKgQEk1g.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-PuW-ve8BkebTWSHshFOloXdXLusaXi-fdWS16cyHsus.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-SXUAy2mXTIMNPylVnC6-Ao9sXsyJnnMuwp-tvN7_dN8.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-jdzH4ebyXHyvMQ2YzL_m_ezwLZI1_EPLeG1HBXYFb0E.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-JpCW4sclFBx2kyg21XfnckrLflFnbMTq5b16Na45-gg.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-er4PZM2swMchucoJe_QhH1CfCZS47PLLAsbrsezMMVM.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-GUqg28SYOxM6sj_53ss_pwkh06ptAlS8LJps2EQGBWg.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-mOViXUTQrZR5AwGy0MOg-GIXi9ziVwLYWjK-j5gYqws.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-heHa_wLGFieenNyyBPn0dqVrzczMuMQnTz6bz2v6QcI.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-qwT5lpbQbbk1dTXJ5Q4LMa5DsYEI3Qx2lQtfEP58OoM.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-hi_kDIVH5qYyPNk_aOz51BbHfyEuyB3DG7_9-il6aeY.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-yfZD2bWTOKoYRqQOIaEWI7wUI4Mta9vssLjgI6d6leg.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-_svO8kyrR-7C3UtCofrHoREdY63y4I3-n9saPYgmM3g.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-u3CMIR07uvdqsPbyx35zkz9duq-j5Yp4LCC2MBoTjMQ.jar
    Sep 20, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-9GfR3upmCSkiUjtMf2ls_lGj_Xq5yh1TP6l_-2nMuus.jar
    Sep 20, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-XilDjs-Joy4Hx7WH0NjVH5cpPgi2Ou95mXgtw6oyYSA.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6907139141169186156.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-6D7MfAnm8T-omOI8KkQe0bR-EZkiR_f4p8OI5zJLcXg.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-gQJrwCN_ykFiwWPHHEer2n_lpl25WWvSmRaqpY9473E.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-vI7j-zPHE0yTAoyNMbzvkhcw8WBsfNBiuYz5oTl3fuA.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-Z93aJK507GAEmZ-9TWP-uKZXDVmYs4w2pD41V-fPdbw.jar
    Sep 20, 2020 12:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-J__H8hAzdOFh0gTJAHdoEyn8T7onQlm_zLov7ysGmBw.jar
    Sep 20, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-Ev3cdHQeQsYgNyE_nzHIciJbF9vJRc3bi8lcMD66fvs.jar
    Sep 20, 2020 12:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-0sd2Goxr50Wpy-moi1YGplfmrHIIY_Pq0YDeIw4DEKM.jar
    Sep 20, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 2 seconds
    Sep 20, 2020 12:45:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 20, 2020 12:45:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 20, 2020 12:45:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 20, 2020 12:45:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 20, 2020 12:45:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 20, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash 012f809d149121065e24c35783cadebff0af28d74e99c923899a9dd2e00ae807> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-AS-AnRSRIQZeJMNXg8rev_CvKNdOmckjiZqd0uAK6Ac.pb
    Sep 20, 2020 12:45:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 20, 2020 12:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-19_17_45_27-13911827363939590386?project=apache-beam-testing
    Sep 20, 2020 12:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-19_17_45_27-13911827363939590386
    Sep 20, 2020 12:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-19_17_45_27-13911827363939590386
    Sep 20, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-20T00:45:27.979Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 20, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:45:35.925Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 20, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:45:36.690Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 20, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:45:36.724Z: Expanding GroupByKey operations into optimizable parts.
    Sep 20, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:45:36.753Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 20, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:45:36.819Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 20, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:45:36.845Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 20, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:45:36.891Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 20, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:45:36.928Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 20, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:45:37.553Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 20, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:45:37.635Z: Starting 5 workers in us-central1-a...
    Sep 20, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:45:51.301Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 20, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:46:04.697Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 20, 2020 12:46:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:46:04.723Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 20, 2020 12:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:46:34.966Z: Workers have started successfully.
    Sep 20, 2020 12:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:46:34.994Z: Workers have started successfully.
    Sep 20, 2020 12:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:46:52.010Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 20, 2020 12:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:47:09.772Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 20, 2020 12:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:47:09.940Z: Cleaning up.
    Sep 20, 2020 12:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:47:10.004Z: Stopping worker pool...
    Sep 20, 2020 12:47:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:47:58.203Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 20, 2020 12:47:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-20T00:47:58.243Z: Worker pool stopped.
    Sep 20, 2020 12:48:07 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-19_17_45_27-13911827363939590386 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 15558596-0991-4951-b62b-38af2044a262 and timestamp: 2020-09-20T00:48:07.831000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.535

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 20, 2020 12:48:08 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
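
    A note on the warning above: the InfluxDB publisher skips publishing when no measurement/database is configured; in Beam's perf-test harness these values are normally supplied through pipeline options. The option names and values below are assumptions for illustration only and may differ for this suite:

        --influxDatabase=beam_test_metrics --influxMeasurement=sql_bqio_read_java_batch --influxHost=http://localhost:8086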

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 56.147 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 50s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/lm4bri3epqzri

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1015

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1015/display/redirect>

Changes:


------------------------------------------
[...truncated 269.25 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
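
    A note on the failure above: the exception lists two remedies (set a Coder explicitly, or attach a schema so a RowCoder can be inferred for the Beam Row output). A minimal Java sketch of the schema-based fix follows; the field names/types and the placeholder variable are illustrative assumptions based on the columns projected by the query, not code taken from this test.

        import org.apache.beam.sdk.coders.RowCoder;
        import org.apache.beam.sdk.schemas.Schema;
        import org.apache.beam.sdk.values.PCollection;
        import org.apache.beam.sdk.values.Row;

        // Schema mirroring the projected columns (types assumed for illustration).
        Schema rowSchema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

        // 'rows' stands in for the Row-typed PCollection the error points at
        // (the RowMonitor ParDo output); it is only a placeholder in this sketch.
        PCollection<Row> rows = /* output of the transform producing Beam Rows */ null;
        rows.setRowSchema(rowSchema);              // lets the SDK infer a RowCoder
        // or, equivalently, set the coder explicitly:
        // rows.setCoder(RowCoder.of(rowSchema));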

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 19, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 19, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 19, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 19, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 19, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 19, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 19, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 19, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 19, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 19, 2020 6:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-ulMcp4VJYsJLN_rJ0toODLxn648w1a-pryC-syrNGD0.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-b0Ze6bNw_0rpqEI0E7I8O-lV7DF9DUhZBY8h4kxNBzA.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-F6JYJ_1Ka5TZKj5AK2RHoWUqSB5BuXEQae9gFtrZmxM.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-OalURPzLnuXJxlIRuGGWknoKQaBmljykW0WJahutqsM.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-Uwqc0OQ7LdJQ0FjBVMSa0q62jIUd3qSAkaacLba1t3g.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-hH2ZStc56Lp3rHPS04lgStIhWRm_u1rzHJI-dnQ61DQ.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-xevc66nHLjvYzHOoB_jJUSZso7fqI0wYgDzj1Phz-o0.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-XcclZuxmBHdXtcejM1L5YcCu1cKogWjK11dCKKDKrHA.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-xJo4mb3wVCZl3vHmqX93yLBZXBQ4yxyC6yLE7CYFEfA.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-tj82YBswWvUaf58s5og2rG0IDNsL7gKzwv-JEKU1g9w.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7994822536720672114.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-DS_9Hz0j9I2QbPIJ3NpFUBq2MjUlzAUMwP-uitvZsnw.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-T5azQ29BeJAiQZEWEKghceH3w0_wA5fnNqjv62yyx-E.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-lcRIwS4sTfr54vW9V5p3FTdv5V-BanWZmkJUTAYvUcU.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-GL8bH8rCvQsZKz2j3Ur11T_PdDHoLSpLpWG_xMLglXw.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-ydzKb0sRm1qoHV9BhXIAQ-KSO_VbQy2V4QNfDkKss0w.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-6u0DGnagUPpSEO2qwPgIpzX_pGLH7jV9koA_MNb_jwE.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-M9AAXvlpRbyIEPDuTH0f-sXKSKTCBsd_WM7kd8TP_ms.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-ulMcp4VJYsJLN_rJ0toODLxn648w1a-pryC-syrNGD0.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-oi_uBYBoXj3d_H4Qsddsqjz1zf0D8t6udSVqZ8bx2Ic.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-0CJ7iVO-Zay-MDnYzp3I1NXxv223jD7QrV-RX0G3cd4.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-q5I6-ANGI22R42wYJs4-0w97nvNpolx_wxqY50B7oyY.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-4x9oN1bT_hrnosinwvByxh5XD4EPknGKnzn_EfW93eA.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-U7z5p0UqBoyay0--djLCQCi0HkK6z3eA57muIwNuM00.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-oBs8VbCYdELfzZkMVamv93wlWMx9D2Nqhpllz49gQVY.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-yBBz7kB10Nk2K9qi0Ff_l4bnIiOGuTMXLnu05l34K_0.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-1JCrGyNlUG6NnTsVgn5Brp7X6-eUcbO1D0EADBPnNgc.jar
    Sep 19, 2020 6:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-FTvmZowqazh89eK5i57ilLmRjQYmGb7hkC0JHcONAzc.jar
    Sep 19, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 0 seconds
    Sep 19, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 19, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 19, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 19, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 19, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 19, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash da6bafbebc59823aabde6a862b85e43de66d3d56be5c5a35dc70edb92482f004> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-2muvvrxZgjqr3mqGK4XkPeZtPVa-XFo13HDtuSSC8AQ.pb
    Sep 19, 2020 6:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 19, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-19_11_45_16-7783838822090100067?project=apache-beam-testing
    Sep 19, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-19_11_45_16-7783838822090100067
    Sep 19, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-19_11_45_16-7783838822090100067
    Sep 19, 2020 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-19T18:45:16.764Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 19, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:45:25.241Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 19, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:45:26.268Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 19, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:45:26.307Z: Expanding GroupByKey operations into optimizable parts.
    Sep 19, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:45:26.338Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 19, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:45:26.411Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 19, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:45:26.443Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 19, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:45:26.480Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 19, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:45:26.512Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 19, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:45:26.865Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 19, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:45:26.937Z: Starting 5 workers in us-central1-b...
    Sep 19, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:45:56.038Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 19, 2020 6:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:45:56.243Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 19, 2020 6:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:46:17.410Z: Workers have started successfully.
    Sep 19, 2020 6:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:46:17.439Z: Workers have started successfully.
    Sep 19, 2020 6:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:46:50.222Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 19, 2020 6:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:46:50.350Z: Cleaning up.
    Sep 19, 2020 6:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:46:50.423Z: Stopping worker pool...
    Sep 19, 2020 6:47:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:47:35.822Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 19, 2020 6:47:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T18:47:35.872Z: Worker pool stopped.
    Sep 19, 2020 6:47:43 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-19_11_45_16-7783838822090100067 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 91a361e0-2e01-4ded-8fb1-4ad8dc6eadc3 and timestamp: 2020-09-19T18:47:43.196000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.249

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 19, 2020 6:47:43 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 39.221 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 28s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/22m3qu3ugomvo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1014

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1014/display/redirect>

Changes:


------------------------------------------
[...truncated 270.50 KB...]
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 19, 2020 12:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 19, 2020 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-FbqPuZOxAoKykOqkNfSVIOrtfDxetX1uu6w0WxHYXkU.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-IpFTqO9Xdj96G8FDz2KIN7xWi2gZCxzxSKUuKDTlXu4.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-CBqr22mazMbS8M_kD2Px0gZmSwBasUDvZ2bQxEtLux0.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-oQJar59lhPJHWl1M6WxZTVE7LZp5gsVDtUBjylJgQpw.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-dMaDAEya6kd3r-XprMPtEnOBhQAsUOMgxtU6oafcWo4.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-IwK3prPQBJ504mgr2BiNKGZYlQ6vLDOv9vhw80aYT4w.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-ZfxwkaIDdisYUy4oqL18GJolHOiW2Pg6rcGUWftHO8E.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-7T9v_KRcmLmtw78ZhfHpMltI6_B9J4WzDxlpjlBV1Zc.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-QP94JTwSWtkaU5-abJ5Y205xaaodqk69fetrlOIoH3U.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-FbqPuZOxAoKykOqkNfSVIOrtfDxetX1uu6w0WxHYXkU.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-A83P9woBJaOjvDucrHzZGX1AbYkIfmNftEEI_Vdvqco.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-KvUGAJqtWE_PLiDfZ2j4VQUJqvcRDYZkDogWuJ36fZc.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-dhQUgSyEzYxU3X7wOILEy-9S5-C5m0mU5SldiTw95jA.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-su2Y2wEVq6yt0KAxWpQsMK6Mb38_fzi9AL6BMUI7GLc.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-iQkNnEmTMth00UkPcviISvC_zJGG6lVW_flPASSvnRM.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4650752779039135780.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-WdZNlMhyD-g7yE7HO3cLtlW1Yr8QoBEW5os3c4xtqVo.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-ViRxBNOYhGMCJ_3sbB43qT1oJ-6iHcyeVseKCcaqgyc.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-WbrPK1S-QSsPMzzLID8iFDYQkkS49enizHanoUfVTAo.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-UeLokLtxYLUOIizJ6mfcW_iiqomb4zKMlkCKzYG5CHk.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-VxKPG6Ov8J5BqxktiMV80hzYvYwvC5RNdhHv926XZps.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-YpmUARdmHUH0-VV-HWgEO5srrcbqpjXMUjCxoJioM1Q.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-GlWaxSJDMlNC6RtIaZUUy3aJjzSeEYBl27y6WO5LkmM.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-o8JjFyaMoYStElZbubEoXx5fIxwuKU8jwYRQl3Bwm34.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-cSmJU8aStObxLMJ_egvBd3atRXOz9kZ_TWulPEOo-R4.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-MwGsT2ySd6PoQy7cFvHwcS-UxypbhG6BM0do9vQCwt0.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-xUohqBrrqfNX5TMAd0yJi5wE7Jl8hrTjsLawIwj03Ko.jar
    Sep 19, 2020 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-Cj0914gSKYo6eLZLeKeQqzhGJuo4pELRR3V7ZyNBkts.jar
    Sep 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 0 seconds
    Sep 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash ec265b64f36a0c6ed5ca125635123c6dd3b4728a116eba5a079ffbce80cb24ba> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-7CZbZPNqDG7VyhJWNRI8bdO0cooRbrpaB5_7zoDLJLo.pb
    Sep 19, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 19, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-19_05_45_16-877911829046410927?project=apache-beam-testing
    Sep 19, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-19_05_45_16-877911829046410927
    Sep 19, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-19_05_45_16-877911829046410927
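
For reference, the same cancellation can also be issued from Java through the runner-agnostic PipelineResult handle returned by Pipeline.run(). A minimal sketch, assuming a pipeline object built elsewhere and submitted with --runner=DataflowRunner; this is illustrative only and not part of the test harness:

    import java.io.IOException;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;

    public class CancelSketch {
      // Hypothetical helper: submit the pipeline, then ask the service to stop the job.
      static void submitAndCancel(Pipeline pipeline) throws IOException {
        PipelineResult result = pipeline.run();
        // cancel() is the programmatic counterpart of the gcloud command printed above.
        result.cancel();
      }
    }
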
    Sep 19, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-19T12:45:16.908Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 19, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:45:23.882Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 19, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:45:24.694Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 19, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:45:24.723Z: Expanding GroupByKey operations into optimizable parts.
    Sep 19, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:45:24.764Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 19, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:45:24.845Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 19, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:45:24.875Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 19, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:45:24.898Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 19, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:45:24.933Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 19, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:45:25.403Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 19, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:45:25.503Z: Starting 5 workers in us-central1-a...
    Sep 19, 2020 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:45:51.948Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 19, 2020 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:46:02.969Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 19, 2020 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:46:03.003Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 19, 2020 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:46:08.308Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 19, 2020 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:46:33.318Z: Workers have started successfully.
    Sep 19, 2020 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:46:33.357Z: Workers have started successfully.
    Sep 19, 2020 12:47:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:47:10.927Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 19, 2020 12:47:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:47:11.071Z: Cleaning up.
    Sep 19, 2020 12:47:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:47:11.148Z: Stopping worker pool...
    Sep 19, 2020 12:47:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:47:55.083Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 19, 2020 12:47:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T12:47:55.121Z: Worker pool stopped.
    Sep 19, 2020 12:48:06 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-19_05_45_16-877911829046410927 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 3f9102e0-ff57-4d11-b560-ac62857fe245 and timestamp: 2020-09-19T12:48:06.411000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    19.277
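
The "Load test results" table above is built from Beam metrics collected once the job reaches a terminal state. A minimal sketch of how counters like these can be read back through the standard metrics API, assuming a finished PipelineResult and an illustrative namespace/name (not necessarily the ones this test uses):

    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricResult;
    import org.apache.beam.sdk.metrics.MetricsFilter;

    public class CounterSketch {
      // Prints attempted counter values matching the given namespace and name.
      static void printCounter(PipelineResult result, String namespace, String name) {
        MetricQueryResults metrics =
            result.metrics().queryMetrics(
                MetricsFilter.builder()
                    .addNameFilter(MetricNameFilter.named(namespace, name))
                    .build());
        for (MetricResult<Long> counter : metrics.getCounters()) {
          // getAttempted() is reported even when a runner cannot guarantee committed values.
          System.out.println(counter.getName() + ": " + counter.getAttempted());
        }
      }
    }
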

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 19, 2020 12:48:06 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 3 mins 1.968 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 50s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/ypxm5jgopxizm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1013

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1013/display/redirect?page=changes>

Changes:

[noreply] [BEAM-9547] Add not_implemented_ok (#12857)

[noreply] Allow pandas 1.x. (#12869)


------------------------------------------
[...truncated 273.33 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 19, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 19, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 19, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 19, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 19, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
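
The IllegalStateException above is Beam's standard construction-time failure when a PCollection of Row carries neither an explicit coder nor a schema. A minimal sketch of the remedy the message itself suggests, attaching a schema via setRowSchema to the output of the Row-producing ParDo; the field names and the parsing DoFn are made up for illustration and are not the integration test's code:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        Schema schema = Schema.builder()
            .addStringField("author")
            .addInt64Field("score")
            .build();

        PCollection<Row> rows =
            p.apply(Create.of("alice:42", "bob:7"))
                .apply("ToRow", ParDo.of(new DoFn<String, Row>() {
                  @ProcessElement
                  public void processElement(@Element String line, OutputReceiver<Row> out) {
                    String[] parts = line.split(":");
                    out.output(Row.withSchema(schema)
                        .addValues(parts[0], Long.parseLong(parts[1]))
                        .build());
                  }
                }))
                // Without this, finishing pipeline construction fails with the
                // IllegalStateException shown above, because no coder can be
                // inferred for Beam Row.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }
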

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 19, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 19, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 19, 2020 6:45:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 19, 2020 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
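
The plan above shows the projection and filter being pushed into the BigQuery read (usedFields plus BigQueryFilter). For readers who want to reproduce the query shape without a BigQuery table, here is a minimal sketch using SqlTransform over an in-memory PCollection with an assumed schema; the integration test itself goes through BeamSqlEnv/BeamSqlRelUtils rather than this transform:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class PushDownQueryShapeSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Assumed column names and types mirroring the fields the query selects.
        Schema schema = Schema.builder()
            .addNullableField("by", Schema.FieldType.STRING)
            .addNullableField("type", Schema.FieldType.STRING)
            .addNullableField("title", Schema.FieldType.STRING)
            .addNullableField("score", Schema.FieldType.INT64)
            .build();

        PCollection<Row> hackerNews =
            p.apply(Create.of(
                    Row.withSchema(schema).addValues("alice", "story", "Hello", 5L).build(),
                    Row.withSchema(schema).addValues("bob", "comment", "Re: Hello", 1L).build())
                .withRowSchema(schema));

        // A single-input SqlTransform exposes its input as the table PCOLLECTION.
        PCollection<Row> filtered =
            hackerNews.apply(
                SqlTransform.query(
                    "SELECT `by` AS author, `type`, `title`, `score` "
                        + "FROM PCOLLECTION "
                        + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        p.run().waitUntilFinish();
      }
    }
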
    Sep 19, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-UeVJtPNrxsbKQGUkv4ki_Uls_J1xxy1gEH0zprcdoG0.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-XNZ_Jy6z6rRWK29lF2sAL4o9fCdclzDgFKhtEEnCOVo.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-jy5ceOPPRg1aC2qIsyTf1k_v-JBROaE9W6wAKt2ANI0.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-rWQlWrHPijCQaXYpNWtJB9_Id6JP1-R8u0LeTgKj6ls.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-Rjn72h7fan1NKEeu6MUYYn1I3qFX1djstZHn5DXnh4Q.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-UeVJtPNrxsbKQGUkv4ki_Uls_J1xxy1gEH0zprcdoG0.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-neb_gCfW7pIYMvrs8375_tdfiL-5Lcj0QkXVZrX02Is.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-jqwoW_hBDgRZziFsr66AUphNkJM6mIguNHFOaty-CGA.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-3JAZzmvEHt4yTZ2cutifHc3Aj5xuy1pAZKbUoH5cRXc.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-OUSpK5msczc1pFnFNo0BpVZr4JtwuvTTQD0goSCH2-c.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-u4-m_y5gz1W4WEZCLL_yWCmT9xJREoiWigB6IXuD_YE.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-PD80N8wLOcTH5OfDGiVk7eBsYzTD48PY1uUtkxiGjuo.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-nzlQ23BTVemPmclO9wPKfynisEBig1PVWGjszEAriWU.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-G15kjGYbxRaRkEXbD-G1j-lvpy5oz0XFpgaKNJS_lmI.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2674166951773094783.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-TyvQSUjZC6sILJyjKUwLPhVLIRmk4tUvrxpuSsuLdts.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-gk3OZn7MEaG5K0xV0Mvv_mwyDYkDbnSMo10Xa3sg9Zo.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-PsJgrRHNsvfcBbZkK34kWZtZdWo_snN7dSfytwBhod0.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-EZUvOq3mVYdAdrNHPpj1EkQisQT6VNXeu71MrvqVAzQ.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-KVT7I8WUJv9ttb1NPEioQUhMIsGPRREFLEnX4QfnkqI.jar
    Sep 19, 2020 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-o5DwlcBNx-6bByY-WfNUT89iwAEl2Q8JvqiK9AUUnCE.jar
    Sep 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-8qn_o8fVUdLcOA1bl19woXpfum9EqUsnnIreym422Ck.jar
    Sep 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-eM8pyd96ZM6IH10zISLCk4e0ZCfPiJzu1csD5DWgfmA.jar
    Sep 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-AMFvbu1FfRYF7_rMOiWWE19XFsXc-N1x8bGmSy6K_Lo.jar
    Sep 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-QtpqsIAEvmLDT9h51CFs-u8sk4DUzTgEPl6KEuzksfc.jar
    Sep 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-u2IGSduaGZy2VHdqW5-URFWvxitZ8rYdxFpBfqJi23M.jar
    Sep 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-jPHxUAL46QG3VrdhjGXpK7uJcrQ0uVopB39WVCV-9_E.jar
    Sep 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-VH8tTYBfvfPLefjKZZSbfrNo14zk3aWQPXUPflTxLP8.jar
    Sep 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 1 seconds
    Sep 19, 2020 6:45:38 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 19, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 19, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 19, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 19, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 19, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash 8d103c133cde3bcb968d987301ba462978c5d484a4b63e46a7bba6731f08dd0f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-jRA8EzzeO8uWjZhzAbpGKXjF1ISktj5Gp7umcx8I3Q8.pb
    Sep 19, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 19, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-18_23_45_39-345060914686757828?project=apache-beam-testing
    Sep 19, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-18_23_45_39-345060914686757828
    Sep 19, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-18_23_45_39-345060914686757828
    Sep 19, 2020 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-19T06:45:39.411Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 19, 2020 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:45:50.536Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 19, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:45:51.485Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 19, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:45:51.682Z: Expanding GroupByKey operations into optimizable parts.
    Sep 19, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:45:51.713Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 19, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:45:51.772Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 19, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:45:51.813Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 19, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:45:51.845Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 19, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:45:51.881Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 19, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:45:52.312Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 19, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:45:52.393Z: Starting 5 workers in us-central1-a...
    Sep 19, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:46:12.621Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 19, 2020 6:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:46:16.661Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 19, 2020 6:46:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:46:40.107Z: Workers have started successfully.
    Sep 19, 2020 6:46:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:46:40.137Z: Workers have started successfully.
    Sep 19, 2020 6:47:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:47:11.618Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 19, 2020 6:47:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:47:11.790Z: Cleaning up.
    Sep 19, 2020 6:47:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:47:11.862Z: Stopping worker pool...
    Sep 19, 2020 6:47:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:47:57.125Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 19, 2020 6:47:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T06:47:57.170Z: Worker pool stopped.
    Sep 19, 2020 6:48:05 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-18_23_45_39-345060914686757828 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): be47802c-f459-43f4-b43b-8f49178918b1 and timestamp: 2020-09-19T06:48:05.415000000Z:
                     Metric:                    Value:
                   read_time                    11.744
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 19, 2020 6:48:05 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 42.104 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 49s
107 actionable tasks: 65 executed, 42 from cache

Publishing build scan...
https://gradle.com/s/xr3e3qlpolfs2

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1012

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1012/display/redirect?page=changes>

Changes:

[daniel.o.programmer] Update Beam website to release 2.24.0.

[daniel.o.programmer] Update date for Beam release.

[daniel.o.programmer] Update date again.

[noreply] Add a blog post for Apache Beam 2.24.0. (#12745)

[noreply] [BEAM-10894] Basic CSV reading and writing. (#12841)

[noreply] [BEAM-7372] Remove Python 2 testing. (#12872)


------------------------------------------
[...truncated 274.68 KB...]
    Sep 19, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 19, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 19, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 19, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 19, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 19, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 19, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 19, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 19, 2020 12:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-2Yi9CGTnn7Y_VUbe5ujHIJNoBLfXLlpZwXBrB3dIU2c.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-zhg1i-JbQ2GW3c4amKLsNaEh9ICmntlwKaFPjWlkg54.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-4lLaBNq1x7zWEh2SjZ5iQ7cSQKDVULBW7TJqdcwyJbg.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-EgtiGvh6ISwIyGku0q5SFs5kx4fckFQtKIjV8f0cTWc.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-Vt4wwtIy9OtDHo9R808uNGEEkOE4hfePG4P-KiyhM9M.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-DmrvdraqL-kgPe3xUWg7BZZS_xyWWLMOVVfQzTegSms.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-phtYkjMdY2wcwoaXCPlpVo7HbHSoHVTcP0AWOEBK9L0.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-ygfNfKcijRZdgYAIFPFO33IltmQTpHkD3nCjEv2tNas.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3962022023690903602.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-2EgzuFW5ynn0nKAAOpBZmc4zKbZ0CigOa-dy6qnvPhU.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-Lf9STY8Xb8z4N223vaJFgKjEIcWpEqVtW1h4blBGfMw.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-PLYCCHappCf15JBpjkkh_zuALFvzca67FZtoUcVPAy4.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-17x6j8Ehj5tMrg5cV7LlAaxJTJm78K-c3UDtHYnTCG4.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-oPT_GgzzBxyheMW6RyiNP63QxsE7UKTi47C7z1jNRHA.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-kCIOZdDacXIvc-V2ywNCnpWYvPHCG3Q_vfhwZyTfQes.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-K4J-zZ58Pu9ohIINxbEX8Mk2PvKBeCQsgW34Is74Uso.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-gZ6OtLHkOOXXW3WA8kVFT7lwodNHdqy-2BZMIOCr6ns.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-fGBtxld6nsjaWz7GIsjq0kIXB28_c4D9l-N_qRCW63I.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-2Yi9CGTnn7Y_VUbe5ujHIJNoBLfXLlpZwXBrB3dIU2c.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-SVgiVbjLdWYB7oAF9IKx5RPAibeaKeOTHXoMfmN4xY0.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-p186Exh2vfrrSjPEA-BfGzUyDGnBd6zi3QWhQdILGfk.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-fVWzQh-z9vCa6AzULN4AtB1yLQBHtreStpDLYleeKOI.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-M2SUQE2FAJYDHlpm-dP9z3zfk-LFLigOyFFSRDrkENM.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-udd4eno328metyn7efY_Kfxh8iSV7fL4JX-6RMLqtTE.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-Ze7ptGbDrRkkVIF-uIoAUei-mBmTfFPIF2UDmfFy8Ds.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-rjXA9jrtIqCCIOkx4r-7uc2RUf9wsOfFewiGsWJOYFQ.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-wmbVXMt8W0ebvXlQp-uQgcc2DXnrN41xt-QO8fefNTM.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-VFRvQJdGG6vbxVCXINpMLolDBh9keJ3YENJcbVuK85U.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-7bSPUjllF5n2gHKnI6ADsGbhvpGHcqw5Bp4cQZ9ORA0.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-UUlRbYr6AtXw7HCe8geMau2TnmGesm9AfaPbtsMvTFc.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-7ec1aqVm7Y1m5LIAyivQ01102hJp8AI5e2oMTl7OGxE.jar
    Sep 19, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-9JDSEEISfD3U-RYLxQVez-9-4hDsMqYdltogCFmbER0.jar
    Sep 19, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 19, 2020 12:45:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 19, 2020 12:45:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 19, 2020 12:45:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 19, 2020 12:45:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 19, 2020 12:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 19, 2020 12:45:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95449 bytes, hash 9737a2face4a02ccb4d213d7d177e37c96dcfb17820c59430e74bd9d3bc44dbc> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-lzei-s5KAsy00hPX0XfjfJbc-xeCDFlDDnS9nTvETbw.pb
    Sep 19, 2020 12:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 19, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-18_17_45_40-1219052452420873710?project=apache-beam-testing
    Sep 19, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-18_17_45_40-1219052452420873710
    Sep 19, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-18_17_45_40-1219052452420873710
    Sep 19, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-19T00:45:40.797Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 19, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:45:50.369Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 19, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:45:51.083Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 19, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:45:51.317Z: Expanding GroupByKey operations into optimizable parts.
    Sep 19, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:45:51.359Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 19, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:45:51.432Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 19, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:45:51.466Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 19, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:45:51.494Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 19, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:45:51.531Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 19, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:45:52.379Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 19, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:45:52.452Z: Starting 5 workers in us-central1-a...
    Sep 19, 2020 12:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:46:18.527Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Sep 19, 2020 12:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:46:18.572Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Sep 19, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:46:23.862Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 19, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:46:23.901Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 19, 2020 12:46:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:46:24.385Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 19, 2020 12:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:46:44.669Z: Workers have started successfully.
    Sep 19, 2020 12:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:46:44.724Z: Workers have started successfully.
    Sep 19, 2020 12:47:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:47:16.545Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 19, 2020 12:47:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:47:16.727Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 19, 2020 12:47:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:47:16.901Z: Cleaning up.
    Sep 19, 2020 12:47:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:47:16.995Z: Stopping worker pool...
    Sep 19, 2020 12:50:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:50:05.534Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 19, 2020 12:50:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-19T00:50:05.582Z: Worker pool stopped.
    Sep 19, 2020 12:50:14 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-18_17_45_40-1219052452420873710 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d63c45d2-48b9-48c9-b38e-d5a2a62773ab and timestamp: 2020-09-19T00:50:14.312000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    11.681

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 19, 2020 12:50:14 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 4 mins 46.939 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 58s
107 actionable tasks: 64 executed, 43 from cache

Publishing build scan...
https://gradle.com/s/xrvlh5rpmohte

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1011

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1011/display/redirect>

Changes:


------------------------------------------
[...truncated 275.36 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
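
The IllegalStateException above is the generic coder-inference failure for a PCollection of Beam Rows: the SDK cannot choose a coder for Row unless the collection either has an explicit coder or carries a schema. A minimal sketch of the second remedy the message itself suggests, PCollection.setRowSchema, is below; the field list, the Int64 score type, and the RowIdentityFn pass-through DoFn are invented for illustration and are not taken from the test.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      // Pass-through DoFn over Rows, similar in spirit to the test's RowMonitor step.
      static class RowIdentityFn extends DoFn<Row, Row> {
        @ProcessElement
        public void process(@Element Row row, OutputReceiver<Row> out) {
          out.output(row);
        }
      }

      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")   // assumed type; the log does not show the column types
            .build();

        PCollection<Row> rows = p.apply(
            Create.of(Row.withSchema(schema).addValues("someone", "story", "a title", 3L).build())
                .withRowSchema(schema));

        // Without setRowSchema (or setCoder) on the ParDo output, coder inference fails
        // with the IllegalStateException shown above, because Row has no default coder.
        rows.apply("RowMonitor", ParDo.of(new RowIdentityFn()))
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }

The same pattern applies to the RowMonitor output named in these failures: any ParDo that emits Row needs setRowSchema (or an explicit setCoder) on its output before downstream transforms are applied.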

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 18, 2020 6:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 18, 2020 6:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-QQFd0XqbC9Ls5yES1miCzqv_3C0zwbe7E5NU9bNCiTI.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-ZfFy62PgFZG-ztZ4Vzk3foXcssa7hXwVYh0ph3rfMys.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-4KnOCdJu7qrCnFkEHTsuBYDQkF9ZNA6LWi7psdTUJ94.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-bTFrHSEaDnHnIxdlRjHhitbBNqQ5nFOnbsM6gd7SzWc.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-xIK_KsMzSAlcvFQv8cN19f4VwawAZhLhD3tMh6RBaRw.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-QQFd0XqbC9Ls5yES1miCzqv_3C0zwbe7E5NU9bNCiTI.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-YA-aN3bMfpmqzpYXH1fU7XWP8q5UJ1QfNJtJNlMMwag.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-iPsJgme7LTMcriBOqdsrVu3L4Eej1ukG3P6oDiW9FoM.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4249448483059047495.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-qUDM2m6Y9rDRR5SQcWDwAJ1L8KnTBguYzXkz_baxsP0.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-jvvXH_LHbRPVFMrQ4VOcSU5lXZwOAZ1JG5IarSh2mo8.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-riVQB_itpOaW57K2VtN4vwXEqhHamoV4hNXK6WAo9Xo.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-2Gz27CKWhGGEguqLMe6TTszVYh03CCarFi_Aw6qHKFA.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-VGZZiEJsU2qCJ2OUwN_teyd8duITC9ZFMfx8y6d27F0.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-iY-ABmw_FbNGE3vQRbueokl6ubGitcRHYn2l9_JDpLA.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-CDnYvU8vHgdil7e-vFXHhPgvro2wqcA7Q2Drgu8tnlE.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-RJIXvvjvTOGVwsc3YaaVYldm1q-cAYy2m36WiWE9ISA.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-uzrIsR62vMJU-6H6ipgEnLWNExQXBq2DPwK0WOnsKEo.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-S872Qhlc-BykjXIu9XYMUD9ic-r29Mrb20GZNg_Fmw0.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-WjZ9n4Xm1HrLEsyFvcKKmBW0Gg4HCOEeMoc526Yg2qE.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-MMRQoq23Cb3_-yodE3Gt4SneLUXWe0iQnQKrx2KUbMY.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-Z3ccmo8xsdFLugS1cCBs1-inSC9PDTmGqAgIetZNwHo.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-DoargQ1S7PNijbwuKQUn2V8xh2KkKZJuSP3HFtoTOKU.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-uZyYFHQs2SdnEIOrwjOVbCNWpnjUeOLYJ6p_TWFNoW0.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-ojfwDcTVpVt5ShdvxqO8XJRy6SChMN_TsghwfOcaVqQ.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-5EiWONc7kIYH8-Az3JlMwUQYk8a734Pwbj9ShjaIC2w.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-UZnMcpNvXV1KOUomDeTRfr0F4LsqqHHAJ_xZgDdOB40.jar
    Sep 18, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-cqnuw5M_kSjaN2rX9WkEWY8pwl62m-_xI5Kb6Oz4eE0.jar
    Sep 18, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 0 seconds
    Sep 18, 2020 6:45:45 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 18, 2020 6:45:45 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 18, 2020 6:45:45 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 18, 2020 6:45:45 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 18, 2020 6:45:45 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 18, 2020 6:45:45 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash eeb9e3a9200f1004be402737dc429a3332e38aa447ebbacac5bb7ad47e9018a2> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-7rnjqSAPEAS-QCc33EKaMzLjiqRH67rKxbt61H6QGKI.pb
    Sep 18, 2020 6:45:45 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 18, 2020 6:45:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-18_11_45_45-13064140472509915854?project=apache-beam-testing
    Sep 18, 2020 6:45:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-18_11_45_45-13064140472509915854
    Sep 18, 2020 6:45:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-18_11_45_45-13064140472509915854
    Sep 18, 2020 6:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-18T18:45:45.953Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 18, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:45:53.877Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 18, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:45:54.733Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 18, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:45:54.774Z: Expanding GroupByKey operations into optimizable parts.
    Sep 18, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:45:54.816Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 18, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:45:54.881Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 18, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:45:54.908Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 18, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:45:54.943Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 18, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:45:54.978Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 18, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:45:55.312Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 18, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:45:55.395Z: Starting 5 workers in us-central1-b...
    Sep 18, 2020 6:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:46:05.402Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 18, 2020 6:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:46:23.630Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 18, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:46:43.419Z: Workers have started successfully.
    Sep 18, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:46:43.469Z: Workers have started successfully.
    Sep 18, 2020 6:47:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:47:14.203Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 18, 2020 6:47:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:47:14.361Z: Cleaning up.
    Sep 18, 2020 6:47:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:47:14.445Z: Stopping worker pool...
    Sep 18, 2020 6:48:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:48:06.340Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 18, 2020 6:48:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T18:48:06.594Z: Worker pool stopped.
    Sep 18, 2020 6:48:23 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-18_11_45_45-13064140472509915854 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 9ed7cc16-ff7e-448a-ab11-5b3577d519e6 and timestamp: 2020-09-18T18:48:23.976000000Z:
                     Metric:                    Value:
                   read_time                    13.335
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 18, 2020 6:48:24 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 51.908 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 7s
107 actionable tasks: 65 executed, 42 from cache

Publishing build scan...
https://gradle.com/s/a2qzwhedcpalg

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1010

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1010/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10643] Major rearrangement of Nexmark dashboards (#12863)


------------------------------------------
[...truncated 272.86 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
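
[Editor's note] Both failures above share the same symptom: the RowMonitor ParDo emits Beam Row values, and its output PCollection is never given a schema, so no Coder can be inferred. A minimal sketch of the remedy the exception itself suggests is below; names such as rows and RowMonitorFn are hypothetical stand-ins, since the IT's actual source is not part of this log.

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Hypothetical schema matching the columns projected by the query above.
    Schema schema =
        Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

    // setRowSchema attaches the schema so Beam can pick RowCoder for the
    // Row output, which is what the IllegalStateException is asking for.
    PCollection<Row> monitored =
        rows                                             // hypothetical PCollection<Row>
            .apply("RowMonitor", ParDo.of(new RowMonitorFn()))  // hypothetical DoFn<Row, Row>
            .setRowSchema(schema);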

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 18, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 18, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-AMiRSEmMxzNjmeqB_yOF3N0ef3CbZkTF6sBfVq9ygTs.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-BO_WC9p5L4dZqsJE3lBBlhBwgz4GzDpo-cPQ7V2_MOQ.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-UErWmYbcaJp4nP6OwjtF0NMAywPsgljm6roqA65x7lQ.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-bBVPp6w6McbSHbeeWWnhFmftY2qWzHzqskQ-6kDYnJk.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-Yg2FmC5lyEyHE-4w-kiZy8UaoiB2jT9yougfEOPvOPo.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-GVxBBlWFCoQJ8ULkXRGh8JVmgAFtQPE0z3PJEkxV2jw.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-LbIIH8WE67uCdqj0PeaWVifxdUrrUuFtapqxCbafJI8.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-8pg9ZYUHyFl4aeOaNQaCb4Itg8OJ9DwNVdJ13A0M5eM.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-XCPrgjUZ4ogxXxmpOi0b8RMhOpj67iBVWecXDwL58dA.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-8CmpnEKH83MFyZGOUV3rh_tX5nDRrfavJZssCthhFxo.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-cWpxbUCiZt_LrFwcAktwV0WCTDTyfa1G20YlZc0ZS7k.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-95RwWBSZgKf4qLG59Kszq-0YGyB9qKLkX2pAg0TNa1s.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-hYSbfo3pnk077PsdUxaxJOVRgoMkR3-DvXyOvmneNQA.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-rU2_fCxYvlkfEBIvJkwlSJLQUAEeBIHkcwircVN5htE.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-hGFgEDPO0wGGL7tlNAwbeViO9Ma-Yw1vibylGBpyZos.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-LxPqVH6ftX3slPr2A5a5Ffynf47fcDDNtfl9slTGyiI.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-38xMweGTtH20nU3zRWtkJ4t8k5KXR3VkER7bZEU2xFE.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-aUveCFTeJuZ1qs_wMcVocbL5VcE6IWL8fMv-AcpXWbU.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-TjvulBBT76G9tCj920LXnyU7Z4HUsUZUgwfqGMnLN6o.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-AMiRSEmMxzNjmeqB_yOF3N0ef3CbZkTF6sBfVq9ygTs.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-uhXPjygq-AbBDI2swP6yyaxN2_Ht_5vdk6mXb7Jv2Wk.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-kaJYFo4XWcZigzwuvTG3oP3eHTr-ZS2966TsfYzsKKQ.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-vC4Ham4L3XTD-lnpcIRGloYJ7AvidKmq4mtO-CWyoGk.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-KM4VFkSQffT8EZgskVZq-TQrmS5ywvlrxasmAYTwvg8.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test63759738527024413.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-785iSX3wunNCZ36Jl9hgI40frI_HptkbqkFUxP4_BKs.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-RnJYGv_YbmFO_N3GkXd2nzGUd4nF5uAW1FbfkqL2qS0.jar
    Sep 18, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-CCkW5EPbgAp4Fnhbd_pa817Eld5hGHrlMYbdC92Sibw.jar
    Sep 18, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 0 seconds
    Sep 18, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 18, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 18, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 18, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 18, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 18, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95443 bytes, hash 5eefa4ca53de8963d6348bd097829546a01f277f7d351f84b1f1581414ac10e2> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Xu-kylPeiWPWNIvQl4KVRqAfJ399NR-EsfFYFBSsEOI.pb
    Sep 18, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 18, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-18_05_45_25-1696692630971199277?project=apache-beam-testing
    Sep 18, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-18_05_45_25-1696692630971199277
    Sep 18, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-18_05_45_25-1696692630971199277
    Sep 18, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-18T12:45:25.344Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 18, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:45:34.885Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 18, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:45:36.101Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 18, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:45:36.139Z: Expanding GroupByKey operations into optimizable parts.
    Sep 18, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:45:36.168Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 18, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:45:36.249Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 18, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:45:36.278Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 18, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:45:36.312Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 18, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:45:36.347Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 18, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:45:36.690Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 18, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:45:36.775Z: Starting 5 workers in us-central1-b...
    Sep 18, 2020 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:45:56.153Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 18, 2020 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:46:05.373Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 18, 2020 12:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:46:28.221Z: Workers have started successfully.
    Sep 18, 2020 12:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:46:28.253Z: Workers have started successfully.
    Sep 18, 2020 12:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:46:57.000Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 18, 2020 12:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:46:57.157Z: Cleaning up.
    Sep 18, 2020 12:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:46:57.220Z: Stopping worker pool...
    Sep 18, 2020 12:47:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:47:51.425Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 18, 2020 12:47:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T12:47:51.466Z: Worker pool stopped.
    Sep 18, 2020 12:48:05 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-18_05_45_25-1696692630971199277 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0dae0184-da99-40e0-a847-e9ea84891107 and timestamp: 2020-09-18T12:48:05.353000000Z:
                     Metric:                    Value:
                   read_time                    12.805
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 18, 2020 12:48:05 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

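[Editor's note] The BEAMPlan above shows the projection and filter being pushed into the BigQuery read (BeamPushDownIOSourceRel with usedFields and BigQueryFilter), which the IT obtains through Beam SQL's BigQuery table provider; that setup is not shown in this log. Purely to illustrate the shape of the query in code, the same SELECT can be run with SqlTransform against an in-memory table, as in the sketch below. Here stories is a hypothetical PCollection<Row> with by/type/title/score fields, and this in-memory path does not exercise the BigQuery push-down.

    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionTuple;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TupleTag;

    // The tuple tag id becomes the table name visible to the SQL query.
    PCollection<Row> result =
        PCollectionTuple.of(new TupleTag<Row>("HACKER_NEWS"), stories)
            .apply(
                SqlTransform.query(
                    "SELECT `by` AS author, type, title, score "
                        + "FROM HACKER_NEWS "
                        + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
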
Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 55.902 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 49s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/zi2mackkcaumy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1009

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1009/display/redirect>

Changes:


------------------------------------------
[...truncated 271.11 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 18, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 18, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 18, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 18, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 18, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 18, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 18, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 18, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 18, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 18, 2020 6:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-S-TcMSkp7XzlimFnooGTslS3MIa8xZ4pqDmYJKUUJvU.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-Jd0pIhk3KuBZe8d_28SppClPMghB54vZ3aULIOTt_N8.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-TrQIOw7zmGysyIVderOm3j7s0sFfOHlsuLklY3XpWzM.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-RpKzYtNFlwUOcZO7klHjUMV7LhUlVqXqbvhGf23aX8A.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-xoxmgZmuCNVXq-n9D5u6r9285_CroDNys4CznogZPnU.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-Z4fJZwg2qQT0848AEmJKqVJnkomdIHmFAc8lNquLhBE.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-272pZrh7Vp7DHnucE2n_mqa4cJ5kV7N6IVijLxSyyuE.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-uySmmi4tT1GwgLuZMxxt2-zBSJ9ZB6INzhG8mU9_2R4.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-ScoYPYTJnIpPmc9s5WeWLX8EU4INbHZRQlHXbOHSj1M.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-jj9ySUXlTLWjCs__CWk7ZcGcp4VtotFiweXRagDgrzI.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-jvTk_yRge5kbR2IX-4W-tOmCcbjgnQL_Q8REfThNWno.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-kctoKjiTjxF005nQse64ZOzRo6_D-OzcdA_hXvlQSvc.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3099444028583100172.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Ea3JIODxhwJlJo00_Gt1BmGMG4DlsudSG4Afqu1dfy4.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-YSs2iYaTRYW3SnsD1Crs473fTSgQg3d8vpZ8kbAnqaY.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-rGtVlYbz5kYgL8FvEeo-pAmiq9YqE3kGWsuiDD0xb64.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-DZcjUZB62q0ojM0Dg93Kc5-6rVcwWEamihL-nISzqkk.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-VYq3UEncn4mynf5aP-EjSwgSe8bQa_gfJmnWL_kIRCk.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-YGmyY5e7jBYy5shZqiTKHE797G8SCd36R1Ms0ajGNEA.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-YSaob-q2BXf9cccXPWde87n88g3eDsfNdyvGf_q-oCY.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-V2kjQgGFI9K0gdM5P2F47_ZDtEeEIREluVQAhHJNjA0.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-5C5D6aFpWFpX9YUvla8rMF8JggnRnYQqUW7pX1pgYOQ.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-3XHVLosiu8pvHdyWugG2m3YGKnRt0QUbASgt1LHWpn0.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-Nlo_TMqF24T1-OJhvFAM7HHbEClR6_0I_DJd09RMcKs.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-bniDQlfKdV_tXhvQpknJ6lUGfzI10G_RKEsuRrhYXIk.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-CXF_2kus1XurAUTm8W5Hh14C4ozXacVK2Q87hNxA-fE.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-chRDcGmRqy-Eih70LEW4fiHTwfsZ517DXEoTomXYtAk.jar
    Sep 18, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-S-TcMSkp7XzlimFnooGTslS3MIa8xZ4pqDmYJKUUJvU.jar
    Sep 18, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 1 seconds
    Sep 18, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 18, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 18, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 18, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 18, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 18, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash 698d7b1e4a13f0f9668a948e8fc4ed222f5855cce3a2ad961b95891d7f342883> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-aY17HkoT8PlmipSOj8TtIi9YVczjoq2WG5WJHX80KIM.pb
    Sep 18, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 18, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-17_23_45_18-4367868156119383329?project=apache-beam-testing
    Sep 18, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-17_23_45_18-4367868156119383329
    Sep 18, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-17_23_45_18-4367868156119383329
    Sep 18, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-18T06:45:18.888Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 18, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:45:27.496Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 18, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:45:28.200Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 18, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:45:28.245Z: Expanding GroupByKey operations into optimizable parts.
    Sep 18, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:45:28.286Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 18, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:45:28.355Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 18, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:45:28.392Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 18, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:45:28.426Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 18, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:45:28.460Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 18, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:45:28.957Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 18, 2020 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:45:29.030Z: Starting 5 workers in us-central1-b...
    Sep 18, 2020 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:45:53.177Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 18, 2020 6:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:45:57.136Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 18, 2020 6:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:46:15.123Z: Workers have started successfully.
    Sep 18, 2020 6:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:46:15.170Z: Workers have started successfully.
    Sep 18, 2020 6:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:46:47.178Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 18, 2020 6:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:46:47.341Z: Cleaning up.
    Sep 18, 2020 6:46:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:46:47.426Z: Stopping worker pool...
    Sep 18, 2020 6:47:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:47:42.116Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 18, 2020 6:47:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T06:47:42.169Z: Worker pool stopped.
    Sep 18, 2020 6:47:50 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-17_23_45_18-4367868156119383329 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 9dc4001f-64a4-49f5-bec0-f5dc03a38dd6 and timestamp: 2020-09-18T06:47:50.455000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.495

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 18, 2020 6:47:50 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 44.717 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 33s
107 actionable tasks: 62 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/j6ndljpiaw5t4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1008

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1008/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Avoid re-encoding row types.

[noreply] [BEAM-10921] Fix flaky unit tests on Windows (#12866)

[noreply] [BEAM-9543] Add blog post for MATCH_RECOGNIZE (#12735)

[sychen] Add a step property for shardable states

[noreply] [BEAM-10906] Add basic Select transform. (#12832)


------------------------------------------
[...truncated 272.81 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 18, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 18, 2020 12:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 18, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 18, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 18, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
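
The IllegalStateException above is the SDK failing to infer a coder for a PCollection of Row values whose schema was never attached; the message itself points at PCollection.setRowSchema as the remedy. A minimal, self-contained sketch of that call, with all names and values chosen for illustration (this is not code from BigQueryIOPushDownIT; it only assumes the columns projected by the query above):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Hypothetical schema matching the projected columns (author, type, title, score).
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

        PCollection<Row> rows =
            p.apply(Create.of("seed"))
             .apply(ParDo.of(new DoFn<String, Row>() {
               @ProcessElement
               public void process(ProcessContext c) {
                 c.output(Row.withSchema(schema)
                     .addValues("someone", "story", "a title", 3L)
                     .build());
               }
             }))
             // Without this call the SDK cannot pick a RowCoder for the ParDo output,
             // which is exactly the failure reported above.
             .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }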

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 18, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 18, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2020 12:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 18, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 18, 2020 12:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 18, 2020 12:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-aVCC1aTUHkFA9N_9b-jtKfBFaxZ2xLt8kpty6oe1CKM.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-8WSQ-2c694nlc_dHVbKQuISY0sNHbZJiGXJeudbUqyU.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tctD9XG8HT3ZVV-FhOTuOxhA1QHF8MikEuccBzSchY0.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-BkQYnayGtDC5W1aiN2h8Znosyfw36EY0PZ5vQz0Dr7U.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-vCoekkR21IscTGa6SOf0nl7rZFtIqrBMg0zI-YBGzpM.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5398045425933865601.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-XeN6gw8uJA7OyAr_4RU4hv5P20qK3EtdBhzj8iULs1o.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-2aNej5MizvRN-8IFp1GNdEr11onFc5nkBDc9o-u2CZI.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-l2PoSIt34MKPEfmP9E228vnF4gXzy6VzgifIZGp6KOw.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-wUvQSTqyQBxtbVadbmsq8q6W60ZPj66A6sxJ4qDb6bo.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-rMmkTQ6P67n839Ckhz2ynpInp42W4i2z6gxm3917Ysg.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-vSExpXudm5LFQgo-TjT-WTAN1t388Py_b1RRNlf2XwY.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-sq2I4dKnxZcRsxU3-_HmihU2AW8IN5pQ5brF1Yy-uvM.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-WLtoQz9w9SXRAi8oOTPxcT4yFQ0Nbs_WRIk-w_DzBdY.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-PKrMEM4IV1aIbQpob7aX3Lnn9riOmpf3niQVL0a2GO4.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-dzjp5cBvP8PZlK0Zddocmpphr0vi8VNzA69Q0m4-hiw.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-9D9WxCtr_HPNB9gHPtw6PTRKjC_Mjvt1w88728Ky-1w.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-iKwFRj9hjbKuZPorvZ8W5iCkNa96uRsx_Vh2PazeH7k.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-byzdTYOmwmcqLBtRtgQ1j5Zbx9Ad31RHCmAbXR4n_LQ.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-jH6Dmz9tQBLJ1_W_Vke9qzyFHgv9yD5p-ND5Sc9dTmU.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-aVCC1aTUHkFA9N_9b-jtKfBFaxZ2xLt8kpty6oe1CKM.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-SUACEgwd4F7efaBIzj6lM280u2ZuPAfGqrlg6mAGK20.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-octkoyjZC7WPkzzBh0ppnQIad_8da20vY4c_gQx16nY.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-dXzY_2wBQwSchD_JGXLUTRKkvMDn6ar6Wjy3uZ_0LA4.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-4Ic9U1n845UZPO0YBQhcUWPZ1dpnQSn-jJa0TWVAJeo.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-kamIxJ_qZq5P6MgKVsOWibnhthiBn4VEkCpdwiSKdlE.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-rdZZbhkLsA8rD-dPfUvZvZGdE6RzGnFboDEXMqIHeH0.jar
    Sep 18, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-77dX7Jg24U1pckKMYECu1OJiuoHwy8uRqH6Jsr0DY_A.jar
    Sep 18, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 0 seconds
    Sep 18, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 18, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 18, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 18, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 18, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 18, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash 1ea4bdfbc970e1c18aa63ec3fe287a2e05134ac1cfe8384f94d6114af27423b2> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-HqS9-8lw4cGKpj7D_ih6LgUTSsHP6DhPlNYRSvJ0I7I.pb
    Sep 18, 2020 12:45:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 18, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-17_17_45_42-4193983501382752725?project=apache-beam-testing
    Sep 18, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-17_17_45_42-4193983501382752725
    Sep 18, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-17_17_45_42-4193983501382752725
    Sep 18, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-18T00:45:42.654Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 18, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:45:50.842Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 18, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:45:51.407Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 18, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:45:51.471Z: Expanding GroupByKey operations into optimizable parts.
    Sep 18, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:45:51.508Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 18, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:45:51.585Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 18, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:45:51.614Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 18, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:45:51.649Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 18, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:45:51.682Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 18, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:45:52.016Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 18, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:45:52.080Z: Starting 5 workers in us-central1-b...
    Sep 18, 2020 12:46:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:46:19.490Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 18, 2020 12:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:46:24.425Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 18, 2020 12:46:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:46:40.993Z: Workers have started successfully.
    Sep 18, 2020 12:46:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:46:41.025Z: Workers have started successfully.
    Sep 18, 2020 12:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:47:15.374Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 18, 2020 12:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:47:15.557Z: Cleaning up.
    Sep 18, 2020 12:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:47:15.654Z: Stopping worker pool...
    Sep 18, 2020 12:48:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:48:13.663Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 18, 2020 12:48:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-18T00:48:13.693Z: Worker pool stopped.
    Sep 18, 2020 12:48:22 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-17_17_45_42-4193983501382752725 finished with status DONE.
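
For reference, the BEAMPlan above shows the push-down working as intended: the projection (usedFields=[by, type, title, score]) and the filter are handed to the BigQuery storage read (BeamPushDownIOSourceRel) instead of being evaluated in BeamCalcRel. The sketch below only illustrates the same query shape through the public SqlTransform API over an in-memory PCollection<Row>; it is a hypothetical stand-in for the HACKER_NEWS table, so no BigQuery push-down happens here (that comes from the BigQuery table provider in DIRECT_READ mode), and it assumes beam-sdks-java-extensions-sql is on the classpath:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class PushDownQuerySketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Hypothetical stand-in for the HACKER_NEWS table used by the test.
        Schema schema = Schema.builder()
            .addStringField("by")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

        PCollection<Row> items =
            p.apply(Create.of(
                    Row.withSchema(schema).addValues("someone", "story", "a title", 3L).build())
                .withRowSchema(schema));

        // Same projection and filter as the query in the log; a single input
        // PCollection is addressed as PCOLLECTION in Beam SQL.
        items.apply(
            SqlTransform.query(
                "SELECT `by` AS author, `type`, `title`, `score` "
                    + "FROM PCOLLECTION "
                    + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        p.run().waitUntilFinish();
      }
    }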

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 8435896c-4c63-4fec-b45f-72b5010abea1 and timestamp: 2020-09-18T00:48:22.697000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.995

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 18, 2020 12:48:23 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 59.809 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 6s
107 actionable tasks: 65 executed, 42 from cache

Publishing build scan...
https://gradle.com/s/ozl7zw6kdtz2a

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1007

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1007/display/redirect?page=changes>

Changes:

[Robert Bradshaw] [BEAM-10889] Add a note about BatchElements on GroupIntoBatches pages.

[Kyle Weaver] [BEAM-10915] Fix error hint for AVG(INT64).

[aromanenko.dev] [BEAM-10816] Make KinesisClientThrottledException public

[noreply] [BEAM-10620] Eliminate nullability errors from

[noreply] Minor GroupBy doc fixes (#12860)


------------------------------------------
[...truncated 271.10 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 17, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 17, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 17, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 17, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 17, 2020 6:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 17, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 17, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 17, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 17, 2020 6:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 17, 2020 6:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 17, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-qXEpjlCUqavIOGmw4Vw_a2h43e7gSaOjSrRmkaEix8o.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-MwXGb0dJ-ddfpZ_2N33fFXrPXmcx5-sMoYRPYz73o5A.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8528050366632878484.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-FsZJ4rfEQ9Mj9OxRb_LscoxTWSi8_X0esgGlr1tBAEo.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-LMFD5MWW9FgpvRjxtRHrGwVel4m1EzK0TkFwlHx1U-o.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-JaHR2gf964VCQknonNHHtZooWrEmF8eBAAeJfab3CHE.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-8LpE63cLrv_bk9gHWfCfGEiFE2d_PXhf3lDk5qvAWnI.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-bNW4SQnWvnEz5pHQjrZv1o5wlfS0FaCrz4Ov7ZPBwE4.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-UngQF1ckQ6pSNQRrjeqQIQ7RYEeQN5VSD9p01oqAKec.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-h-JHJbzySfsIpIZPv6uoidQ4EHvmCz7uyUloJaOcZ4Y.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-b9Sw4wMoAExZDdZ_s4apKjYX9bbaSMZGuoVhM5XTXB0.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-mmeqxUta7a6S_tsG676A9pZ-Y2jdC49UBKcXUPVEhKc.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-CTPxdQ817usY1GEBi71jDQzSPWx6-P4HRfy20kfAy6w.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Bz8kPQsMFX10UAeCaKvyJPgE9582UYhNUyHNa7_PcDg.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-fYz99pwiTpG356ffiDhtYwXpowW9LfO6CPzkidincI4.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-urCAWLDsV3BkxmtbS7UR6XsctrEYfGmIH6kaG6FKCnQ.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-MdWBi18iaz_np2iQCPdaGBQQo3OsdI3VSPGhPLiZhdA.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-AcYAf4uuIOR3HoyjySDGv_CWyslHrCsk35z1Jva7jOw.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-AqRrzuBBv0KmBO4u7megEFVhzGRSfgdxwggY8F3IMH4.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-aFCGfbG68LAzdO9iwRQr8cfc_RneqZGXrXuP89Pi-cA.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Bz8kPQsMFX10UAeCaKvyJPgE9582UYhNUyHNa7_PcDg.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-Mds5QPF3xY2g6qKQHo63YXW_xkumiT1MNRkNSBeNG-0.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-kmJYLx6IGEejc_URH4NLV9MXSGM5lxU5lodANrPAnxM.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-ueqlmTTesOxNv3UdvZJyEKyBfwpLH5cCwCYGxs7tmRM.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-Vj-xPRLRtvspxzYISVopJcIppDkzsGXo3aTADK2qDRE.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-nYjYYWJG9bkJX1mvfpn6ySQKOW8F4mHhgZdanbZxMJI.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-yLCbXobMYkLNrSok0kAKODshF940QZwkYOn4JlDmdIs.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-y3AScwVGo6SVURVPxxGSxDiK_w3LiYVtp-e_kFwt5eU.jar
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 0 seconds
    Sep 17, 2020 6:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 17, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 17, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 17, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 17, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 17, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash d21253fd5518a11325f6536911b7d7bb8b83ee6c6e9c7860a0866bfe7855b056> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-0hJT_VUYoRMl9lNpEbfXu4uD7mxunHhgoIZr_nhVsFY.pb
    Sep 17, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 17, 2020 6:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-17_11_45_28-14493791183980167453?project=apache-beam-testing
    Sep 17, 2020 6:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-17_11_45_28-14493791183980167453
    Sep 17, 2020 6:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-17_11_45_28-14493791183980167453
    Sep 17, 2020 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-17T18:45:28.432Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:45:36.318Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:45:36.930Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:45:36.958Z: Expanding GroupByKey operations into optimizable parts.
    Sep 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:45:36.983Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:45:37.044Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:45:37.071Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:45:37.095Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:45:37.120Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:45:37.572Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 17, 2020 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:45:37.637Z: Starting 5 workers in us-central1-b...
    Sep 17, 2020 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:45:51.149Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 17, 2020 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:46:05.063Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 17, 2020 6:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:46:29.950Z: Workers have started successfully.
    Sep 17, 2020 6:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:46:29.991Z: Workers have started successfully.
    Sep 17, 2020 6:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:46:59.582Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 17, 2020 6:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:46:59.711Z: Cleaning up.
    Sep 17, 2020 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:46:59.792Z: Stopping worker pool...
    Sep 17, 2020 6:47:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:47:50.779Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 17, 2020 6:47:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T18:47:50.830Z: Worker pool stopped.
    Sep 17, 2020 6:48:00 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-17_11_45_28-14493791183980167453 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 4f781dfa-23e9-4e30-8b5f-cade3f7a14c0 and timestamp: 2020-09-17T18:48:00.296000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    10.685

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 17, 2020 6:48:00 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 45.432 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 43s
107 actionable tasks: 63 executed, 44 from cache

Publishing build scan...
https://gradle.com/s/gjct57odd4cas

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1006

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1006/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10870] Add raw private key param to snowflake cross-language


------------------------------------------
[...truncated 268.62 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)
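
    The exception above is the whole failure: the test applies a ParDo that emits Beam Row elements, and a coder for Row cannot be inferred, so the output PCollection needs an explicit schema (or coder) before the pipeline can finish being constructed. A minimal sketch of the fix the message itself suggests, assuming a DoFn that outputs Rows with the four projected columns (the field list and the RowMonitorFn class are illustrative stand-ins, not the test's actual code):

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Illustrative schema; the real HACKER_NEWS table has more fields.
    Schema schema = Schema.builder()
        .addStringField("author")
        .addStringField("type")
        .addStringField("title")
        .addInt64Field("score")
        .build();

    // sourceRows and RowMonitorFn are hypothetical stand-ins for the test's own transforms.
    PCollection<Row> monitored = sourceRows
        .apply("RowMonitor", ParDo.of(new RowMonitorFn()))
        .setRowSchema(schema);              // or: .setCoder(RowCoder.of(schema))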

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 17, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 17, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 17, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 17, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 17, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 17, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 17, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 17, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 17, 2020 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
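
    The plan above is what distinguishes this test variant: the scan is rewritten into a BeamPushDownIOSourceRel that requests only the four used columns and hands the WHERE clause to the BigQuery Storage API, which is also what the "Pushing down the following filter" line reports. Outside of Beam SQL, roughly the same read can be expressed directly with BigQueryIO; the table reference below is an assumption for illustration, not the table this job actually reads:

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;

    // Sketch only: project the used fields and let the Storage API evaluate the filter.
    BigQueryIO.TypedRead<TableRow> read = BigQueryIO.readTableRows()
        .from("bigquery-public-data:hacker_news.full")      // assumed table, for illustration
        .withMethod(Method.DIRECT_READ)
        .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
        .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2");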
    Sep 17, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-td7wroA-6gpSwkIDGT4XqfckNmh7u3c7hEzU_9Lalv0.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-Y05vd4q3zGoN9FVYpkrcCEJpCBpM-iz8bNMT2kNUGVI.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-jLL4cLkqUaPDlylaM1Dyw7QbbgVIQILvuIov-R3vOEA.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-t8uGJ3rnXBe7EH632j3W40NIArS9ejEGh0qGq4iWCds.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-9pWTMmlk8sGM9cYcaNuJAcfuuQlZ4Mlm43-8bYnhnpE.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-jqmS_054vJ5FzJwk9745-ITzpgrrQ3_GqjBpXxvFmbU.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-eQunNtEiEW01-C0xvX6Y7sw209-q_KyChrV8GS4KctM.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8069727198249426218.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-qoaYzzIxvkkqwzWWUng4Dx7u2AtEDF91e9IAXMGkklU.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-wNxgNd3FcStoSC4YHtGvBlrInEZRtX1Fu2HYegJsgQk.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-LDmbLxRf0x5DocuXxboaqF0uMbYlssks-t4AIncRFDw.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-vbxTIlqpP8xNluXCNpwiUBNp711SMyav_L-3bCUX-WY.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-td7wroA-6gpSwkIDGT4XqfckNmh7u3c7hEzU_9Lalv0.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-zzXC-sCD2xT-SxkU6Dxz19AQxPNWoTdFfIIjj-9vBKQ.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-6sWyRkHV03Mqy3-CsnsR_sU_iDbE-wE9hh9jwDcUqoY.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-hc4BIT90m57vMgmVq3F7t8aumFkK4fbF2KVBCKxHTdA.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-wAE2WZXW9EYmXR_bK4zkdTeGoNVWdLGeO5rBOY7rys0.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-ln5KDF7ob7FKZmV34Nj4fGXHbwjmIsQ3OYTPaIWH1TE.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-F9IVOEne_wEFhfoxW5xwqZlxTS5XawhcgHffTZlTexU.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-pjeqmkIXn8pPR8-MpnZiT41Op5_yxvYoqYSNmSf6PDA.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-wU7S5U2qxR0A5GeRUGJNbFO0mtY-OMavKp1BY2nfTV4.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-Yd4udaAlE2ubO8kFfTuSwMLQDvaROPM8iq024EeWVL4.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-nAYpPQ7R68GhMd9hlVSCbtO1N0avva-5OoE7Fj4-c78.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-royQyvJsCqmvcusF9ryKD9WigCm1B3azDgN-YdOz3sg.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-Ex7-IqHXD_3BMED3E3u1Geh9PhX_x-uANM7w2-FvMWs.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-p6pnqsueQShMX15Kqr9IvRua_BulQyUmWYAxoYNnoM4.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-xkJT0phqsjM1PoOT9gaPxq2iceqhR1Qfz_KXfh2Zsps.jar
    Sep 17, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-XrWb5_1NsbKRJAOnfYjE_kfAC27u-fAtdmtqbOOs3-o.jar
    Sep 17, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 1 seconds
    Sep 17, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 17, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 17, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 17, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 17, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 17, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash 2f94c181363d75b388aa9c6fc7ef5948a12789fe5e23bcc01baa545f8d0ba66b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-L5TBgTY9dbOIqpxvx-9ZSKEnif5eI7zAG6pUX40Lpms.pb
    Sep 17, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 17, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-17_05_45_23-4785705186495463209?project=apache-beam-testing
    Sep 17, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-17_05_45_23-4785705186495463209
    Sep 17, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-17_05_45_23-4785705186495463209
    Sep 17, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-17T12:45:23.162Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
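
    This warning is expected for the perf test: the job pins the worker count and disables autoscaling, so the configured maximum is simply ignored. For a job that should scale instead of using a fixed pool, the corresponding Dataflow options would look roughly like the sketch below (an assumption about a generic pipeline, not how this test is configured; args is the main method's argument array):

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    // Enable throughput-based autoscaling up to 5 workers instead of a fixed pool.
    DataflowPipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
    options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.THROUGHPUT_BASED);
    options.setMaxNumWorkers(5);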
    Sep 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:45:31.308Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:45:31.943Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:45:31.984Z: Expanding GroupByKey operations into optimizable parts.
    Sep 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:45:32.008Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:45:32.113Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:45:32.196Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:45:32.283Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:45:32.352Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:45:32.938Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 17, 2020 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:45:33.025Z: Starting 5 workers in us-central1-f...
    Sep 17, 2020 12:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:45:53.253Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 17, 2020 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:46:00.868Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 17, 2020 12:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:46:24.601Z: Workers have started successfully.
    Sep 17, 2020 12:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:46:24.630Z: Workers have started successfully.
    Sep 17, 2020 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:46:55.598Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 17, 2020 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:46:55.806Z: Cleaning up.
    Sep 17, 2020 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:46:55.898Z: Stopping worker pool...
    Sep 17, 2020 12:47:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:47:38.965Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 17, 2020 12:47:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T12:47:39.027Z: Worker pool stopped.
    Sep 17, 2020 12:47:47 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-17_05_45_23-4785705186495463209 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): bf9c2f37-8e61-41c2-8f19-ee1f502a082f and timestamp: 2020-09-17T12:47:47.743000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.248

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 17, 2020 12:47:48 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 2 mins 38.314 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 31s
106 actionable tasks: 61 executed, 45 from cache

Publishing build scan...
https://gradle.com/s/b5aikmhr2en2e

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1005

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1005/display/redirect?page=changes>

Changes:

[Robin Qiu] Support UNNEST of a (possibly nested) array field of a struct column


------------------------------------------
[...truncated 272.86 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 17, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 17, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 17, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 17, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 17, 2020 6:45:43 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 17, 2020 6:45:44 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2020 6:45:44 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 6:45:44 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 17, 2020 6:45:44 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2020 6:45:44 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 6:45:44 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 17, 2020 6:45:44 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 17, 2020 6:45:44 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 17, 2020 6:45:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-_RM7kOkRfEdXWBeUUb3im2H0Rdb4HxqU6-AJK2Tq3aE.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-XYwLPrd9mePoib2WOLYNZO2uFerPfyPUBuAtd8LRycw.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-4VZE0kUb91z3TGp_Q7ack0ksRExoYDZfFxxzZuigQrM.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-ExJAD_lqmGhMVzGh-Rqwr6XWHIsXa99zNbdi-9bH8E8.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-_DULmanrRQWUXj_cKyxnTxxXJKI0YYDUMx8mGhw0vtU.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-AsdNMJo9mjaJvW3i4Spzjx3yf9bQrUMgQSjDiLEpqr8.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-MGm-W3ZGSKlVUnz9FmF60sYqDvs0TgYKMUmLb3Qt_Gc.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-5W6OvqVjNGwAwSJs00GZ_eahnPkxd3VCY3iRzDz_YCw.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-_o1UfVgkShY3zgnLQ8EsfpHFhDz_KLd7tltsS5JMB-g.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-CN_dHMRaBTvQVYrDrmFELjEkeS-OdyIub7nWtXLQIlw.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-gJ6xMxFjwguN3ILDmIb3n86hQdWkXpXMNm6uxVWOekE.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-_RM7kOkRfEdXWBeUUb3im2H0Rdb4HxqU6-AJK2Tq3aE.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-vw00BcJbMllBP_jxvxk1Jbvv_EkD3zg04bNKS7sJ9y0.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-dkOM8r9gB5ASDujXPVTNFT7NRz4obUR1Y6De52zh0sI.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-P8WBhQwNbug-LAF2lwaoB6Q6KKzAlOO-TEyMoJOIKwc.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2194341051306486805.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-YmIP4KUF1gdkDuK8kdG7B7KJzNtBkokpTmXTqdBaElw.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-FDcexH2fJ0DoA-wTUkBjiSrnj4skj-ftqt-hP_zwcz4.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-NZrmXY0biEQClW9QofjhlynI2MK6ByHKcGBh6pFxxoI.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-W34W2XyBzpCahbEZF48u4XhBTn2SwEGtxyChL5xKJew.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-wvbkFm4GFlWR6AET7m8fQHm-8-63iSSdV9P95IAZnlE.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-9_Zt3HnQEIJoJc7eBfvuhVrYqj5XjJbFiKff7U-hW6E.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-IudMEVamMlsfuBwd4uWWW_tp8P2PwTxafhrUPBbKJQ8.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-hyrL-jfO3UHrut2VVqB3iScdMY8GNrVBb48hMbb6soo.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-Zz1oM3z7cDhY7q3BL0ZhQRkmh8g7Ei7XdQ5Y4HYq7GM.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-ToQdRuf6kNofHEnIJHUwBjPTS8KWygWW-ZSqsGfWCoI.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-OOe7PLlvDJvu3OlT9BAQFIJvO-0FD1lB2RiU-QB0fsU.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-0nEd9DBG5LDDvh-_fdzH9X_QKZNmVTu7r_oDuQ3xKKg.jar
    Sep 17, 2020 6:45:48 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 0 seconds
    Sep 17, 2020 6:45:49 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 17, 2020 6:45:49 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 17, 2020 6:45:49 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 17, 2020 6:45:49 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 17, 2020 6:45:49 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 17, 2020 6:45:49 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95442 bytes, hash 420fc2311209167709222783b9ecd12167982b7eae3ece26ec6e1e631eddeb48> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Qg_CMRIJFncJIieDuezRIWeYK36uPs4m7G4eYx7d60g.pb
    Sep 17, 2020 6:45:49 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 17, 2020 6:45:51 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-16_23_45_49-7910135258459698124?project=apache-beam-testing
    Sep 17, 2020 6:45:51 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-16_23_45_49-7910135258459698124
    Sep 17, 2020 6:45:51 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-16_23_45_49-7910135258459698124
    Sep 17, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-17T06:45:49.664Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 17, 2020 6:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:45:57.209Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 17, 2020 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:45:58.140Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 17, 2020 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:45:58.172Z: Expanding GroupByKey operations into optimizable parts.
    Sep 17, 2020 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:45:58.191Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 17, 2020 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:45:58.259Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 17, 2020 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:45:58.311Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 17, 2020 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:45:58.346Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 17, 2020 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:45:58.386Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 17, 2020 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:45:58.855Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 17, 2020 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:45:58.934Z: Starting 5 workers in us-central1-b...
    Sep 17, 2020 6:46:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:46:31.425Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 17, 2020 6:46:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:46:31.761Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 17, 2020 6:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:46:50.487Z: Workers have started successfully.
    Sep 17, 2020 6:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:46:50.521Z: Workers have started successfully.
    Sep 17, 2020 6:47:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:47:25.423Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 17, 2020 6:47:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:47:25.690Z: Cleaning up.
    Sep 17, 2020 6:47:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:47:25.754Z: Stopping worker pool...
    Sep 17, 2020 6:48:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:48:11.550Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 17, 2020 6:48:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T06:48:11.595Z: Worker pool stopped.
    Sep 17, 2020 6:48:19 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-16_23_45_49-7910135258459698124 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 40f482cf-146f-451c-a72a-0f31fd8c9273 and timestamp: 2020-09-17T06:48:19.427000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     16.86

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 17, 2020 6:48:19 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 44.304 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 1s
106 actionable tasks: 64 executed, 42 from cache

Publishing build scan...
https://gradle.com/s/xb6ex6ioqdleq

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1004

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1004/display/redirect?page=changes>

Changes:

[noreply] [BEAM-7009] Add Go SDK Standard Coders yaml tests. (#12588)

[noreply] [BEAM-10907] Revert "Deprecate obsolete CombineFn.add_inputs. (#12802)"


------------------------------------------
[...truncated 274.77 KB...]
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 17, 2020 12:48:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 17, 2020 12:48:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 12:48:54 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 17, 2020 12:48:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 17, 2020 12:48:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 12:48:54 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 17, 2020 12:48:55 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
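
The coder failure above is the schema problem the error text points at: a PCollection of Beam Row has no default coder unless a schema is attached. Below is a minimal, self-contained sketch of the remedy the message itself suggests, PCollection.setRowSchema; the schema fields are assumptions chosen to mirror the query's projection, not the integration test's actual HACKER_NEWS schema.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Assumed schema mirroring the projected columns (author, type, title, score).
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(
                    Create.of(
                            Row.withSchema(schema)
                                .addValues("someone", "story", "example title", 3L)
                                .build())
                        .withCoder(RowCoder.of(schema)))
                // Attaching the schema is what the error message asks for; without it the
                // pipeline fails with "Unable to return a default Coder ... Please provide a
                // schema instead using PCollection.setRowSchema", as in the log above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }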

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 17, 2020 12:48:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2020 12:48:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 12:48:55 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 17, 2020 12:48:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2020 12:48:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2020 12:48:55 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 17, 2020 12:48:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 17, 2020 12:48:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
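
For reference, the projection and filter being pushed into the source above correspond roughly to the following direct BigQueryIO read over the Storage API. This is a minimal sketch only; the table reference and option values are illustrative assumptions, not values taken from the test.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // DIRECT_READ uses the BigQuery Storage API, so the selected fields and the row
        // restriction are applied server-side, which is the same effect as the push-down
        // logged above. The table name here is an assumption for illustration.
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full")
                    .withMethod(Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
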
    Sep 17, 2020 12:48:58 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 17, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 17, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-dwCDT-LaeUvr1MTyxitei1cixow3WVIXTk4bE0iTNsk.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-QR1RCOtkHc1JIobo-Gl4Sjfs6uWLBmSH_m0XPvK4mvo.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-pHb_Ep2jtDuPVSNtMq_8q-Utf4VUrKGqU8b7iFcLTUQ.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-B7LY9BwObfJDzSE_APz2GfritLd_3ffBZ1cp2bd3KHw.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-JE_ebhMcnISAP2M4tdhncVMF2CPjzOzLYo1SZqlKKm8.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-63bujuWzyqaoqen8ZWNgvnANJXvWJ7eXCGADm70DjNw.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-I9ldQchvKXB64yhDpPCMf2rdtmEVchUTRhfM5wOiT68.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-jZjoueqBye2g0Ik6O6uIv1Q6eSpr7czDXX45jywekVU.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-9tDhgLDG5a0Ac6tWv9Mmmgbj0b9j54Saemuoe9N4Dio.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-vlZdFtQyl4jTX9O-bTLUwOgbn805zqhgKaZ8gZwgOcs.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-MmeC-bgAChdNiO86nvsnuetJRRcnfeDYBBHPq21tljQ.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-oQFP3FPr8TvOheXOs5A2vMhuU4bTzHqmThmOBH8REP8.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-ih2x0hjeuOCpb6U1dpKPnLe_7A3W7KJg0CNdd0MPE6w.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-dwCDT-LaeUvr1MTyxitei1cixow3WVIXTk4bE0iTNsk.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-IuvruH3qzDeor1m6E_yC5zVIvpwg6SZMM7fuuhhlD_Y.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test47767524104222523.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-99YpDolLQfzKpIxHaVx5lgU4rcLHiwXFUSl2kuB6sYU.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-obiY7gLPP-_JRkl609Hkg6_n7HbTzupmSrkuvwxGg7A.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-NFMlFnqihkHmAmjlM86yVG8HVePOgDZfOfyi6ZeJtT4.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-339nP12PisLhqzW1SUtfIfnFFwN4rsk2P5fuIzPRnAs.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-Hz6gwgzKZl_ZAGZwIJjcu2WYooqjM9Za4vRCR3yMMds.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-tXgeOOmtRwQj53xHp8OvBLGscZQMe8Xh31u_0KqSeWQ.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests--opWwWGNR1jkDxBk0Q2BlrA14IAcd5Xwfln2I-8bpNw.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-40LaTqSdIeRQXP9-IGvKtpM0B4RN10TAtgki8MFqKzI.jar
    Sep 17, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-Lo_FNlr4i-OwDbVDmvD2qrRGcJUYnfFLY7tIk_IGqZY.jar
    Sep 17, 2020 12:49:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-JCWzKIH4IdtW1vEY4-8PmGKXOH3VqvX7Qd0Td2PeiKI.jar
    Sep 17, 2020 12:49:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-tkMlbEOvn8H850acISyovt1hy_mNbYu6BFJtUcNJMPY.jar
    Sep 17, 2020 12:49:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-Ix7pLsAWsRvcEJRGU8-HF0spcWohhn_xIKf_LYOcHF8.jar
    Sep 17, 2020 12:49:13 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 193 files cached, 26 files newly uploaded in 8 seconds
    Sep 17, 2020 12:49:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 17, 2020 12:49:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 17, 2020 12:49:14 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 17, 2020 12:49:14 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 17, 2020 12:49:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 17, 2020 12:49:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95446 bytes, hash 8a472d4fece699f93fbf25b1cc0edb0b156a9f95d57b9f7f53b15c9604abd7f9> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ikctT-zmmfk_vyWxzA7bCxVqn5XVe59_U7FclgSr1_k.pb
    Sep 17, 2020 12:49:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 17, 2020 12:49:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-16_17_49_15-13930437868880093637?project=apache-beam-testing
    Sep 17, 2020 12:49:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-16_17_49_15-13930437868880093637
    Sep 17, 2020 12:49:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-16_17_49_15-13930437868880093637
    Sep 17, 2020 12:49:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-17T00:49:15.046Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 17, 2020 12:49:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:49:24.396Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 17, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:49:25.517Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 17, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:49:25.562Z: Expanding GroupByKey operations into optimizable parts.
    Sep 17, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:49:25.600Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 17, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:49:25.695Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 17, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:49:25.736Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 17, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:49:25.782Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 17, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:49:25.857Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 17, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:49:26.540Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 17, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:49:26.638Z: Starting 5 workers in us-central1-b...
    Sep 17, 2020 12:49:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:49:40.447Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 17, 2020 12:50:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:49:59.469Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 17, 2020 12:50:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:49:59.510Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 17, 2020 12:50:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:50:10.403Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 17, 2020 12:50:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:50:22.981Z: Workers have started successfully.
    Sep 17, 2020 12:50:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:50:23.028Z: Workers have started successfully.
    Sep 17, 2020 12:51:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:51:02.491Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 17, 2020 12:51:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:51:02.691Z: Cleaning up.
    Sep 17, 2020 12:51:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:51:02.845Z: Stopping worker pool...
    Sep 17, 2020 12:52:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:52:02.714Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 17, 2020 12:52:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-17T00:52:02.769Z: Worker pool stopped.
    Sep 17, 2020 12:52:13 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-16_17_49_15-13930437868880093637 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c5608ce6-d82c-45ce-a37b-86a0ce5e6e98 and timestamp: 2020-09-17T00:52:13.496000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    18.247

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 17, 2020 12:52:14 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.08 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.076 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 3 mins 46.216 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 30s
106 actionable tasks: 64 executed, 42 from cache

Publishing build scan...
https://gradle.com/s/6ox3soav7p5s4

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1003

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1003/display/redirect?page=changes>

Changes:

[Luke Cwik] [BEAM-10670] Update Samza to be opt-out for SplittableDoFn.

[noreply] [BEAM-10616] Add missing ParDo test cases for streaming/Flink (#12848)

[noreply] Bump versions of protobuf, shadow, other gradle plugins. (#12821)

[Luke Cwik] [BEAM-10670] Update Jet runner to be opt-out for splittable DoFn

[Luke Cwik] Update runners/jet/build.gradle

[noreply] [BEAM-7523] Fix starting Kafka container twice in KafkaCSVTableIT


------------------------------------------
[...truncated 280.23 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 16, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 16, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 16, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 16, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 16, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 16, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 16, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 6:45:51 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 16, 2020 6:45:52 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 16, 2020 6:45:52 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 16, 2020 6:45:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Yl2XWpcEbpQb8IE81AlqdzcSDw8H7Mlk9oe3GNnkxPQ.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-gxULC99WTlJrhwYBhsluMmHS5iE1GJ6uRzYXH6mU5dA.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-r9UbWTX1-XhymtznWY4Hdh1uymmTM-bkzFBRdFMFuq8.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tfbaX8oLaomDs8i8nQs0qj8u2YFAFKCH313WfgbfgCM.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-1OLzrnwI7niiWxh99ufD7-Xff4xkdwGnffDDo1rFrdc.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-H3v29aFEe7HJdzo7ZSBmUpomEkmaHP2c-CV0ZzJ0R-A.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-CqZselwo9o9bna3FVIY5Xyhlkamn27C9fFmqFv82-70.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test576096033016983519.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-m2roMSdhY_eudOjpI0ZTZVHoyeODVpTkOjAHEytLKvQ.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Yl2XWpcEbpQb8IE81AlqdzcSDw8H7Mlk9oe3GNnkxPQ.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-wimR6Y3IZkDSB7jXw2D16zus0wW07JNBwe5Pc0fs258.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-9QBXAYqHO3lPLX-IPnbvEfeCOGhbn8284PqNyz8LU7Q.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-g1A6amZ3x0nXWBRCkfYVVl-8dMoKYcC_rx_3_dzxMk8.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-2ax4s-6DTr6k95VblmZ6zlu2kh1Lmn6Fcy4btr22e38.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-NTOPEF-Mwgv_swHTSb-OXUW7sP5N_45_vIfPq2ldztw.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-U_2S4bp9r8Bq84IsvQn2uUuSUj-dzZDuozvCzIO6vDg.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-_kspdruASl9xQ4nh4T3hUB1o1igYN1ZnAxpUQN9e4J8.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-BDNL_KYfJoi3Ce9bdeSl3h2bXU3hD9GIGBHSmVezSoo.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-onSxJkBsUdW5-gwWv3v95LjO0O6aE--IYUDbBg28QMo.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-m-k5Bji8fo-dr5-KhrqE3eaL1jgVX3quYsGxYVQo18k.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-5BFdh4qjzpYx8p0PvmIcGO_6p6Isdv2Hh68e2wtx7pc.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-v3KFjF8h9AXqILNvI9dYSqKTmfKWIr0edRdTJHlfbpo.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-uY_hLbn__49xKN5ekkaZwLhL1MJsER8-uKq9rSeG-g0.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-oT0yCZsWfC2RMDc839vDMUGIDs-PhKpVwm1fpABri_U.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-dtn4fxJbNGzzMXFRekYxTR3r8LdMLspBFkEu4VoR-5c.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-Z3eCX4IZYZo3KUar93sopF2U5lIBNxFuveT4h11Hhgs.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-MSGreGJ5UEZzZqvQwA-41yYnPJ4mBOER0mm8sZlGlJU.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-STOrKhKobyxOS8k6JO2BS9hj2DFTSTsAod1QWrEevEw.jar
    Sep 16, 2020 6:45:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-dtHXHi6CujKiitmz3UUXs4sXJxAOTCH4LUcDbZ6nvts.jar
    Sep 16, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-brvaluTKTtdQ37m4V2JEJoiBoQ6haTBylCIi2g9GpC0.jar
    Sep 16, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-WFXWkqVJU1NYU-2z1QyxS_IJcZ83fVMUvWoQJiraVo0.jar
    Sep 16, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-A5UaIrsp26x82yUPnEcESy_J1zHxN4TDWkS1gOM41KQ.jar
    Sep 16, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 16, 2020 6:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 16, 2020 6:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 16, 2020 6:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 16, 2020 6:45:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 16, 2020 6:45:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 16, 2020 6:45:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95444 bytes, hash 1206949bda6f8270c45a12789f9266e5e1320c1f57abe8a5d370bb8a00197b86> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-EgaUm9pvgnDEWhJ4n5Jm5eEyDB9Xq-il03C7igAZe4Y.pb
    Sep 16, 2020 6:45:57 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 16, 2020 6:45:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-16_11_45_57-17744232215389877017?project=apache-beam-testing
    Sep 16, 2020 6:45:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-16_11_45_57-17744232215389877017
    Sep 16, 2020 6:45:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-16_11_45_57-17744232215389877017
    Sep 16, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-16T18:45:57.245Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 16, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:06.459Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 16, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:07.324Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 16, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:07.360Z: Expanding GroupByKey operations into optimizable parts.
    Sep 16, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:07.406Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 16, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:07.486Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 16, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:07.528Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 16, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:07.570Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 16, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:07.610Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 16, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:08.056Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 16, 2020 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:08.135Z: Starting 5 workers in us-central1-a...
    Sep 16, 2020 6:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:37.087Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Sep 16, 2020 6:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:37.121Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Sep 16, 2020 6:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:37.501Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 16, 2020 6:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:42.515Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 16, 2020 6:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:59.203Z: Workers have started successfully.
    Sep 16, 2020 6:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:46:59.229Z: Workers have started successfully.
    Sep 16, 2020 6:47:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:47:29.293Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 16, 2020 6:47:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:47:29.472Z: Cleaning up.
    Sep 16, 2020 6:47:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:47:29.554Z: Stopping worker pool...
    Sep 16, 2020 6:48:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:48:20.131Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 16, 2020 6:48:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T18:48:20.178Z: Worker pool stopped.
    Sep 16, 2020 6:48:34 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-16_11_45_57-17744232215389877017 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): a27bc37b-aa34-44d0-b380-7e049d17937f and timestamp: 2020-09-16T18:48:34.336000000Z:
                     Metric:                    Value:
                   read_time                    10.627
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 16, 2020 6:48:34 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.039 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 51.942 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 17s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/lkhocbhcxclcm

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1002

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1002/display/redirect>

Changes:


------------------------------------------
[...truncated 280.15 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 16, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 16, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 16, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 16, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 16, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
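
The IllegalStateException above is Beam's standard complaint when a PCollection of Row elements reaches pipeline finalization without a schema attached. The snippet below is a minimal, illustrative sketch of the fix the message itself suggests (PCollection.setRowSchema); the schema fields and the DoFn are assumptions chosen to mirror the projected columns in the query, not the failing test's actual code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFixSketch {
      // Hypothetical schema matching the projected columns (author, type, title, score).
      private static final Schema SCHEMA =
          Schema.builder()
              .addStringField("author")
              .addStringField("type")
              .addStringField("title")
              .addInt32Field("score")
              .build();

      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        PCollection<Row> rows =
            p.apply(Create.of("seed"))                  // String has a default coder
             .apply(ParDo.of(new DoFn<String, Row>() {  // emits Rows, like the RowMonitor step
               @ProcessElement
               public void process(ProcessContext c) {
                 c.output(Row.withSchema(SCHEMA).addValues("a", "story", "t", 3).build());
               }
             }));
        // Without this call, coder inference fails exactly as in the log:
        // "Cannot provide a coder for a Beam Row. Please provide a schema instead ..."
        rows.setRowSchema(SCHEMA);
        p.run().waitUntilFinish();
      }
    }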

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 16, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 16, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 16, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 16, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
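
    The two planner lines above (usedFields and the pushed-down filter) are what the SQL layer hands to the BigQuery Storage read. Outside SQL, roughly the same effect can be expressed directly on BigQueryIO; the sketch below is illustrative only, assuming a placeholder table reference rather than the table this integration test actually reads.

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    // Placeholder table reference (not the test's actual table).
                    .from("my-project:my_dataset.hacker_news")
                    .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                    // Counterpart of usedFields=[by, type, title, score]
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // Counterpart of the pushed-down filter logged above
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
        p.run().waitUntilFinish();
      }
    }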
    Sep 16, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-nmL4C9w2QukrugxR2tYGMp9ALnSsKlzNVHhdelznqeA.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-51RlGeYpMLMEN1cy2-xcXZsQYXBcrr0EylzNx6afg-E.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3621666794091506746.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-SbKq1h-6cJdWa7uWAaspc3mgK05TWD3Mt1A9jvouOjI.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-Cjd0mEtIPi4iqNIhuHZ8cwyj21eFUjSZS0RiEhifC5g.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-hC_3ZhlnOIqGktFDEfIieLA73NHSHeoZXpNTW_GVbsY.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-tTjW4T0uuBDNAWR8G8dSx0_jpnxgppnfCg2HCBdv440.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-2u4n6rP2eE8VJxOLmH_keAcwTGrMHL9WQ7P1t2jYIoc.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-8iDAZ1olcbpAcAP3ZWEPbQDjpeojRJXTEzkqLKYgI-Y.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-jtIVPXzk5SlS3QE97P1UIz3UZ3Fuz449mrJhy5NeQZE.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-61OF19IX8ARU9x1ZCom0GMWwqfcza953_APM1XY-oTs.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-D-FurGy57UsWHZr_sHY2R4GIW2SDlLI8FOgSTvrnsHo.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-ScqOSdz0A8TSWAyyUc3qbtW0Oj7uEhXmzS9D2FZQBUc.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-nmL4C9w2QukrugxR2tYGMp9ALnSsKlzNVHhdelznqeA.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-UFYxARfDybLdYs1M9YWRL4unbDA06_0hy6o1mkZB1ZU.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-CFrDK_eBeqL8PBL2f9lOHpjJ_3XPfK21SmmyC_6VEVY.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-Gkh0z8xFTJFByzWqarhUuXjNyyKcOgv-cw6nE71u2zw.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-XtdQa5WT6GrP7AEKdk2WROMUcHhqX2D5FF1Hz9Db2cM.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-z6Y_NrUcj5lexv_KZrgeH4shWuAY_nOztNdBO22K5wE.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-PD2olcvutIRb4gMkrgXgb8gJUctG01zHtHbGiEJy4bk.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-Wd8Q8Kwdabnsc4HXyKSVlbFUnzhfWvxdFigXZzcK2TU.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT--CmF8d57760uieCPQt2BQI462M3LUV1lk7hnK58fbU4.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-oHcKk0CUABK0k4BgFzuwZiE3xqxA7qTcabh8WG62uME.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-s6lSTliAYLc_-NIUzZlNaMZfV7PQ5KNAomDgZSSKyB8.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-7WyR4DZR6zlAE6aRrOs8sTCijptUNqDfn3uwaKWd_KE.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-XcZjsO20n54gsYr_8AWcnHIQ17Brv1cdwtQ8YOzDce4.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-dJPbPLqBFoXQsnux5Ccwg8XulV5J2t8Z30dTwiJENHw.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-7PDhOk5qsdJCxcztkN-cOxHQ-EHzkm5BVqabLe-_QRY.jar
    Sep 16, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-tNxSBE3yOynJtJUVNFzW6qh_pS3My19WB492BE1R_b4.jar
    Sep 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT--y77zi3cm0sHv3S0e_5kJgHRh4WaD8C693Tjl68JECI.jar
    Sep 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-ZRs_sT8p2zm3Tqx167pZ4qWgWmp8Yurx23_RnfhxOuk.jar
    Sep 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-k-an3t3icmNB4haQ_71ChjTx8e9ZQzXviIXi74-4gLo.jar
    Sep 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash 6bb2c62983fa666974b536ae931950511388c8149134b83ae65da77c41cac3be> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-a7LGKYP6Zml0tTaukxlQUROIyBSRNLg65l2nfEHKw74.pb
    Sep 16, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 16, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-16_05_45_21-15069650673728437973?project=apache-beam-testing
    Sep 16, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-16_05_45_21-15069650673728437973
    Sep 16, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-16_05_45_21-15069650673728437973
    Sep 16, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-16T12:45:21.994Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 16, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:45:31.805Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 16, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:45:32.464Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 16, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:45:32.519Z: Expanding GroupByKey operations into optimizable parts.
    Sep 16, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:45:32.549Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 16, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:45:32.634Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 16, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:45:32.653Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 16, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:45:32.676Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 16, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:45:32.702Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 16, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:45:33.099Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 16, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:45:33.175Z: Starting 5 workers in us-central1-f...
    Sep 16, 2020 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:45:59.228Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 16, 2020 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:46:06.294Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 16, 2020 12:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:46:25.193Z: Workers have started successfully.
    Sep 16, 2020 12:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:46:25.223Z: Workers have started successfully.
    Sep 16, 2020 12:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:46:58.920Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 16, 2020 12:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:46:59.084Z: Cleaning up.
    Sep 16, 2020 12:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:46:59.161Z: Stopping worker pool...
    Sep 16, 2020 12:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:47:41.400Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 16, 2020 12:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T12:47:41.450Z: Worker pool stopped.
    Sep 16, 2020 12:47:50 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-16_05_45_21-15069650673728437973 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 575430a6-1817-4e83-b86b-73e1e6c09fbf and timestamp: 2020-09-16T12:47:50.062000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.657

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 16, 2020 12:47:50 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 44.102 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 33s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/j7gquxscxy7tw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1001

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1001/display/redirect?page=changes>

Changes:

[daniel.o.programmer] [BEAM-6928] Update changelog with WriteToBigQuery changed requirements.


------------------------------------------
[...truncated 279.91 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 16, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 16, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-PlpNHP2AVrXJNbpVCYWpS-Z5Kz6JbPH7scTSOfgCk3I.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-ev5DcQT5jGR-WZq-ZWGbpDAnYBdzrSZOLoPTFMciiF4.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6521923645571006288.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-JFiidMgtwWNyRF4AATwcugDaGa8HjqexZtlFLDjN7nA.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-dGN2xv4mExAe0vVuO6AvTokjoQ6Equh4E4fFZgZgAlA.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-eUq97lwZoba4ernCx7g_4dkM7TD4u9U9JAh9pnhh9P4.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-yNh6LjUMHh_kJys-7QxdS35_8Vk5oDeK8b05LE9dFQg.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-kPFfy-FS1ZIsXRxeq_szbhafPMhXbN1wiKZsr2Q5ZpQ.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-3_LUR6G-bgwJ5SkO6OEoDakkQrOBOcesNmoo2YnsaQY.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-j3AFvG2M051Tbfkwoz73Sn9o_YaawXH1ipljBFz2Z6A.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-coSh5HSbpEU_ryJ4Bu57Id4QEH80LPGpQaD3LRKp63A.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-5moEgJn6KDTBlC3hkxvtFyKY5We3V3V_zbVsDEMn2SM.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-2rtR1unJm3rN5LPLndIGRjUfq3AQCzZCddiKVVXHqtE.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-y14MkyiSEzT-IqW_QhASmP23SKbw9Cpeo0yszYgFGX0.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-PlpNHP2AVrXJNbpVCYWpS-Z5Kz6JbPH7scTSOfgCk3I.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-QaXjpTNcyq5_aqnYzipc-2BmSeVy_6eo55Xb6LKGkH0.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-0J5LjgbLAhfPYYjtlmHeahf951yUw8NoM-ESPgyhEoM.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-2UW7wmnGxYL3lVO-BZn3-s8CWTbd858ODbKDa-nZCMQ.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-YiMcUsZSVf5HdAvEqpEmSz-XCXAPgfOc-yhLj67FR8Q.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-26AKMWm945UPQn4IMJSEZPeYejnVEobl28ktKfrz9_A.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-gnnWqi99lykg0m-tDAO_caa60w-MZKy_Ui2cvXIt1PU.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-fh5cXbudHfjlg7I2tBUGbywENg-hkEgGXvtWrZ2YAsU.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-uYWYszhZN9n_utjJDhLxX6hFS_YK503lmcwkKwGVl44.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-hKEoSgi-GaR-GcocZ2imeyPhnaapUaDxi-ldJQhnbgM.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-EP0nVgb-PYh6cwtgc6HsyRkld7qVQ3Dl_3Xw5Yw3yio.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-BFAjKe_Sw0WU1jkoRpxdGqR-46jv9pfbo0fWiIX5wuQ.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-nHcIVwLfEfBcnTTmLmrLBh5etRilNSKza2GNpE-40HQ.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-ygKlEV8J35L4GW0zTOvg-qhKnRDjvmaxc5VXNjdlL1A.jar
    Sep 16, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-rZZGCwXmsJ155W3WO5gRaD3CE9OeYQFnUgQIJ_oHCrQ.jar
    Sep 16, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-PEf6frSkbETB4TH1Yx0n78hLPcZIUWUedONjzOVMyUE.jar
    Sep 16, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-4wSObsZJCp3ZrLPDhFza59FPheRpzA_YbjhVJd3SAX0.jar
    Sep 16, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-PrU521v9DYhdP7yFWvMX4b413ToxF3vy0Ws8j_YBeUg.jar
    Sep 16, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 16, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 16, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 16, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 16, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 16, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 16, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash 22b54b2eb4203b9f91877121b281db6a76582d103a295ca5bae0d27b560e81a8> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-IrVLLrQgO5-Rh3EhsoHbanZYLRA6KVyluuDSe1YOgag.pb
    Sep 16, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 16, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-15_23_45_18-14711656897969555355?project=apache-beam-testing
    Sep 16, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-15_23_45_18-14711656897969555355
    Sep 16, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-15_23_45_18-14711656897969555355
    Sep 16, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-16T06:45:18.537Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 16, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:45:27.233Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 16, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:45:27.946Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 16, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:45:27.989Z: Expanding GroupByKey operations into optimizable parts.
    Sep 16, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:45:28.016Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 16, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:45:28.152Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 16, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:45:28.177Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 16, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:45:28.201Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 16, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:45:28.222Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 16, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:45:28.612Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 16, 2020 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:45:28.696Z: Starting 5 workers in us-central1-f...
    Sep 16, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:45:54.011Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 16, 2020 6:46:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:46:01.649Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 16, 2020 6:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:46:13.009Z: Workers have started successfully.
    Sep 16, 2020 6:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:46:13.047Z: Workers have started successfully.
    Sep 16, 2020 6:46:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:46:48.465Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 16, 2020 6:46:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:46:48.633Z: Cleaning up.
    Sep 16, 2020 6:46:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:46:48.716Z: Stopping worker pool...
    Sep 16, 2020 6:47:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:47:32.136Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 16, 2020 6:47:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T06:47:32.183Z: Worker pool stopped.
    Sep 16, 2020 6:47:39 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-15_23_45_18-14711656897969555355 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d7563cf7-33ff-4199-ab98-d914c5bc37de and timestamp: 2020-09-16T06:47:39.904000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    15.511

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 16, 2020 6:47:40 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 35.299 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 23s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/ssyu7uvrts5ku

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #1000

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1000/display/redirect?page=changes>

Changes:

[txian] Support NUMERIC in spanner schema parser

[Kyle Weaver] [BEAM-9575] Only copy the Spark runner jar, not whatever other jars


------------------------------------------
[...truncated 281.26 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 16, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 16, 2020 12:45:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 16, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 16, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 16, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

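[Editor's note] The IllegalStateException above names its own fixes: give the Row-producing output a schema (PCollection.setRowSchema) or set a coder explicitly. A minimal, self-contained sketch of the setRowSchema option, assuming an illustrative four-column schema and class name (not the real HACKER_NEWS schema or test code):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TypeDescriptors;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Illustrative schema only: the projected columns of the failing query.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(Create.of("seed"))
                .apply(
                    MapElements.into(TypeDescriptors.rows())
                        .via((String s) ->
                            Row.withSchema(schema)
                                .addValues("someone", "story", s, 3L)
                                .build()))
                // Without this call, coder inference fails with the same
                // "Cannot provide a coder for a Beam Row" IllegalStateException.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }
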
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 16, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 16, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 16, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 16, 2020 12:45:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
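[Editor's note] For reference, the column pruning and row restriction that the SQL planner arranges here can also be requested by hand on a plain BigQueryIO read. A minimal sketch, assuming an illustrative table reference and class name (the real test table is configured elsewhere):

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Hand-written equivalent of the projection and filter pushed down above.
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("my-project:my_dataset.HACKER_NEWS") // hypothetical table reference
                    .withMethod(TypedRead.Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }

Both withSelectedFields and withRowRestriction are only honored on the Storage Read API (DIRECT_READ) path.
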
    Sep 16, 2020 12:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-lGD4cGIAf4wy_fyqVk5_YIWRnk28ALxJT4TGBkU_jRw.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-Uo06AnssB0aAmNpcBQEAUiQCxl2F4WBYoHUaeLFaA2g.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-lGD4cGIAf4wy_fyqVk5_YIWRnk28ALxJT4TGBkU_jRw.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-_V_wBWmFrhMxWGphLn9ML5XUHiDW6G_PBDbAz_37q18.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-z-w3YVRrp9Glc1tis_27Kfn-DPakmScLNVEH8Mb3zTY.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-VgsalEcERZ4gjPz5KPUaN0cI4ZATcd9Rqdrk3qhXqew.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-sFulctzKu-dV22bydWUrwCkzgeW2p3eOdpRD0JtV7kI.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-CWPT5r6rJD5cfc4-bRxnzJEFnHg1RQdqWZ6dCNwYd8s.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-vFFflHk4GDTJibYLe85xRkDKSCT6T0sty6V9mM8X5FY.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-1AaUqxuIP7tDekYmeSxeYyqvIfdhYcZxiea95M-M-xU.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-Y9b7ZGo_KoAGs3mgEPXNUbASAOYkpHUd6Hw7Dr7C40o.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-8MmMfKJ-6FCpONpFdE7OMgeIQG9YNURwj7LPMGlu9tk.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-3N4fEOThe_EWMjbg-8QidAhpt5ti1u2Atdd6a68L0EQ.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-I_1bwSwVl_MJRuVxMui4W3j1I035caR0Aq-XRf3xoVM.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-Jwapr3wyfrmmA8kPWPOaw4YzOBxMvreA844_FlCHq64.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-Y0ffWLKf5NoB0EupT33NMgWgAiYYNB5KtazvWRHEdok.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-VofPVylV91GZYKPUa2ONeBYkQzh4tGJNgUUWvFU52qs.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-pV7mNkam8g0lGgvwxaJ5vXQDjPLI7IxjWeNX7FY3VCc.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-PBOPummQvYy-5ZY1S05kXSoVZTpiy991fZZvu4uhc9c.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-zWla5RFV1-w2F6cXTc42hNJpsoO4hacYHOeYI-CQAWA.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-A2A8AjjE6Z336VZ7rjkCmww8kGc06nfO1NdbPY5-ONo.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-AGt20y6-Z8dDKezniRqkICWX4ISNnKu0i4He75b4aCU.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-HGEmLAAo0aHHai1MeXwZpf5RAduFoZB19746DIYBVxo.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-fiprJUuVj7HOaxWftxckiOepgPWMBhgX00ae0UFk8Fo.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-uuiLvhHIKycqMdUu6ZtezK5pRnMxp6wYN5VQzQTCHQk.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2853864065336070637.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-6yo2WsaosGAqjJrecyPnLXMUWFRwxtxiU0-TkYQHkxM.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-3TDk2Rwrq5D_FAKaFgEnyxtdnZQ29Fy2dxKV8xT0gpE.jar
    Sep 16, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-RW95Xb36QedKHcRDVzwt4AqrYBTlwvci3wGMdjRJsaY.jar
    Sep 16, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-AfBjIucKF7iAdQiLoNZ82jyqQCVeU1O49aOEbmh11vU.jar
    Sep 16, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-osNcjQ7wV7NWrz9fnmoOgURqF9uh_aaJ5TBnpOlKoPo.jar
    Sep 16, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-CnZhDA9pmbKmHfR6s_x9S6fapbNvrZLEvD_mL2-0jm4.jar
    Sep 16, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 16, 2020 12:45:44 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 16, 2020 12:45:44 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 16, 2020 12:45:44 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 16, 2020 12:45:44 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 16, 2020 12:45:44 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 16, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash f37b7b724a9962517d5877da0e202f2b86fd1c128a20e4c7a90c350defd74a7a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-83t7ckqZYlF9WHfaDiAvK4b9HBKKIOTHqQw1De_XSno.pb
    Sep 16, 2020 12:45:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 16, 2020 12:45:46 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-15_17_45_45-2233762094182272882?project=apache-beam-testing
    Sep 16, 2020 12:45:46 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-15_17_45_45-2233762094182272882
    Sep 16, 2020 12:45:46 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-15_17_45_45-2233762094182272882
    Sep 16, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-16T00:45:45.263Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 16, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:45:54.466Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 16, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:45:55.399Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 16, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:45:55.458Z: Expanding GroupByKey operations into optimizable parts.
    Sep 16, 2020 12:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:45:55.489Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 16, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:45:55.574Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 16, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:45:55.610Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 16, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:45:55.647Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 16, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:45:55.679Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 16, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:45:56.036Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 16, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:45:56.117Z: Starting 5 workers in us-central1-a...
    Sep 16, 2020 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-16T00:46:20.605Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 16, 2020 12:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:46:23.574Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 16, 2020 12:46:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:46:42.438Z: Workers have started successfully.
    Sep 16, 2020 12:46:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:46:42.473Z: Workers have started successfully.
    Sep 16, 2020 12:47:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:47:22.448Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 16, 2020 12:47:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:47:22.604Z: Cleaning up.
    Sep 16, 2020 12:47:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:47:22.680Z: Stopping worker pool...
    Sep 16, 2020 12:48:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:48:11.665Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 16, 2020 12:48:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-16T00:48:11.707Z: Worker pool stopped.
    Sep 16, 2020 12:48:29 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-15_17_45_45-2233762094182272882 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 9f8c3184-fba6-4fa2-a7c3-6d5aee9f7c9b and timestamp: 2020-09-16T00:48:29.177000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    21.642

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 16, 2020 12:48:29 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 3 mins 0.053 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 13s
107 actionable tasks: 73 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/wnepw6sgbwugu

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #999

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/999/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Use index for partitioning of elementwise operations with multiple

[Robert Bradshaw] [BEAM-10873] Introduce partitioning session for stronger testing.

[Robert Bradshaw] [BEAM-10873] Use partitioinig session for tests.

[noreply] Merge pull request #12704 from [BEAM-10603] Implement the new Large

[noreply] write to file ability for java Nexmark suite (#12813)

[noreply] add readme file to python nexmark suites (#12808)

[noreply] Merge pull request #12807 from [BEAM-2855] implement query 10

[noreply] Merge pull request #12770 from [BEAM-10545] Assembled the extension with

[noreply] Document GroupBy transform. (#12834)

[noreply] [BEAM-10886] Fix Java Wordcount Direct Runner (windows-latest) (#12846)

[noreply] * [BEAM-10705] Extract and use the filename when downloading a remote

[noreply] [BEAM-10641] Add eliminate_common_key_with_none graph optimizer (#12787)


------------------------------------------
[...truncated 280.35 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 15, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 15, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 15, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 15, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 15, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

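[Editor's note] Same coder failure as in the previous run. Besides setRowSchema (sketched after the earlier occurrence), the ".setCoder()" branch of the error message can be satisfied with an explicit RowCoder. A minimal sketch, again with an illustrative two-column schema and class name rather than the actual test code:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowCoderSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Illustrative schema; not the actual HACKER_NEWS table schema.
        Schema schema =
            Schema.builder().addStringField("type").addInt64Field("score").build();

        PCollection<Row> rows =
            p.apply(Create.of(1L, 2L, 3L))
                .apply(
                    ParDo.of(
                        new DoFn<Long, Row>() {
                          @ProcessElement
                          public void processElement(@Element Long score, OutputReceiver<Row> out) {
                            out.output(Row.withSchema(schema).addValues("story", score).build());
                          }
                        }))
                // An explicit coder is the ".setCoder()" option named in the error;
                // .setRowSchema(schema) would work equally well here.
                .setCoder(RowCoder.of(schema));

        p.run().waitUntilFinish();
      }
    }
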
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 15, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 15, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 15, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 15, 2020 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 15, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-UZHdoOJj21ON2QOvSX5n3yk2leyPyUl6AQF12vyb6qs.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-UXq3UAiZ9kiQXd4at19jok318qBl-rxmXgPMTM2W6J4.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-UZHdoOJj21ON2QOvSX5n3yk2leyPyUl6AQF12vyb6qs.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-qsuLs3xYkAOqLE48zielqwQYBJBsclSIc431FEyJ5kA.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-cQDIu1cCC3JF6NVmjZyt_DGT0U5ikTVD5Tjzptf6InY.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-KXmwuBPfvMEumKQalOJcuCfqo9G8M-Obi4VVhdlR7vA.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-TFa5_GC643ZgbM86mPfmAAUd1o7211bnf0Va4LXBvIY.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-iYobWmzhXaik9CFAIehzJm5ZH2eEKs6Eyfzy9jVIRbI.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-Q4ewG4XiYSmP95Bo3f_VuOoxxICzgM3ROYzKLbmtyIo.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-mEFbUWNeFQaej8odsy22gjaNId7EmEtVDlHewCbxfqk.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-_fYwK--mE7D3twb3AFn3jM_XKbTEkJfmvT579Qvg6XU.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-kUb5sVV273IqtqJrn9caYF4zwu8EOORlQXHeut_q4r8.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-DtpJ4ITOv2xivGkCDRs9QAR_b0EhyY_TKQgI3iAlQ50.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test615900106991142805.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-85s-S0vrjskpgQd1WP9z8oU6rvEPO9_wMkC9Q61y__g.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-yxPy_3lzTv8dZcvyu0yM_sfoctUmtaR0HVRdGfWe9uA.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-rPcluKr2Xvp8Im0yEDofen2Klfe_yo_lqDLyFP4JFOE.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-3gTBw0njfCLG-M3faDG32a7khc-XLFATnM1gxelimjQ.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-BfRzv9rTnxR6v69iDXIycKGzFXnA42pB4rCpGMWSIr0.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-e2kfPC0uJ90jknnxDEtYZ-unrmr1oop3o3SUniifOOI.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-1LBtPnNKuL84Sz7WUBVHcUVeoFRp5M0ZZVAA9klFgYw.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-IuSGpBG9VxJ44QsOusPwk2ZP9omJxT9vRFVECdBVukg.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-Vl8mNNFv3hdJ8T5BieJgTTe_DVU_7i--KjYo24hzr80.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-V3SqLQWTpK8L8MIooIubLXx9_iO6GoshtxgSCa7k6kU.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-PC4Yous5w1qVb4eGDRqJhwLeQ7xhnx6lXsIF7FsdiUw.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-xK9rsBkaf-ChHlqUwCcGIYy2v4IojD9R7eGWbPi27-Q.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-u3nc60X54G_xv7xJJ1mkz91I-gbpcTOnZpjlZXRROI4.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-wtm7bpOhbIbglV9aKGrprT7g151603jCzv79QcboZr0.jar
    Sep 15, 2020 6:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-HhV_FnnzfEPrB3PT3L74cf0X6AEMGn3WqNom4W_bl9I.jar
    Sep 15, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-pNb6iXgweNf0Rw1GmiuTNmhSaX-kJplvdZs12skc9sk.jar
    Sep 15, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-nY5Wf_AXhMyBU2-y6tgCva0HHhoecCeueTGMhKpIxnE.jar
    Sep 15, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-4RkIbBZeu1JZAoKPyYGH7VZmmvGh5toPlMWs-sOJtjA.jar
    Sep 15, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 15, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 15, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 15, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 15, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 15, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 15, 2020 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95444 bytes, hash 8c489501e908f6cbedbe845f26bf4d87a7ae74e8834a357b800098e8f86e80c4> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-jEiVAekI9svtvoRfJr9Nh6eudOiDSjV7gACY6PhugMQ.pb
    Sep 15, 2020 6:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 15, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-15_11_45_26-7269939878914622645?project=apache-beam-testing
    Sep 15, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-15_11_45_26-7269939878914622645
    Sep 15, 2020 6:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-15_11_45_26-7269939878914622645
    Sep 15, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-15T18:45:26.491Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 15, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:45:35.218Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 15, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:45:35.796Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 15, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:45:35.841Z: Expanding GroupByKey operations into optimizable parts.
    Sep 15, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:45:35.868Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 15, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:45:36.007Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 15, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:45:36.039Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 15, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:45:36.067Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 15, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:45:36.098Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 15, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:45:36.446Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 15, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:45:36.520Z: Starting 5 workers in us-central1-b...
    Sep 15, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-15T18:45:45.698Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 15, 2020 6:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:46:02.296Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 15, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:46:20.652Z: Workers have started successfully.
    Sep 15, 2020 6:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:46:20.677Z: Workers have started successfully.
    Sep 15, 2020 6:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:46:54.009Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 15, 2020 6:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:46:54.164Z: Cleaning up.
    Sep 15, 2020 6:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:46:54.252Z: Stopping worker pool...
    Sep 15, 2020 6:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:47:39.993Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 15, 2020 6:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T18:47:40.038Z: Worker pool stopped.
    Sep 15, 2020 6:47:48 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-15_11_45_26-7269939878914622645 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): a3d19f85-4a61-446b-aa7a-000d0a95d539 and timestamp: 2020-09-15T18:47:48.921000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.781

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 15, 2020 6:47:49 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 38.929 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 30s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/yx3ka3iywxuvu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #998

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/998/display/redirect>

Changes:


------------------------------------------
[...truncated 279.42 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 15, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 15, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 15, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 15, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 15, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
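
The coder failure above has the same root cause in both readUsingDefaultMethod and readUsingDirectReadMethod: the PCollection<Row> emitted by ParDo(RowMonitor) carries no schema, so no RowCoder can be inferred for it. The error text itself names the remedy (PCollection.setRowSchema). Below is a minimal, self-contained sketch of that pattern; the class name, field names, and field types are assumptions taken from the projected columns of the query (author, type, title, score), not the test's actual HACKER_NEWS schema or the test code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Assumed schema matching the SELECT list of the failing query.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> input =
            p.apply(
                "CreateRows",
                Create.of(
                        Row.withSchema(schema)
                            .addValues("someone", "story", "example title", 3L)
                            .build())
                    .withRowSchema(schema));

        PCollection<Row> monitored =
            input
                .apply(
                    "RowMonitor",
                    ParDo.of(
                        new DoFn<Row, Row>() {
                          @ProcessElement
                          public void processElement(@Element Row row, OutputReceiver<Row> out) {
                            // Pass-through; a real monitor would record per-row metrics here.
                            out.output(row);
                          }
                        }))
                // Without this call, pipeline expansion fails with the
                // "Unable to return a default Coder ... for a Beam Row" error seen above.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }

With the schema attached, Beam can derive a RowCoder for the ParDo output, and pipeline expansion gets past the point where both tests currently abort.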

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 15, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 15, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 15, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 15, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 15, 2020 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-v2B0WPzRzqqUhIB_3LizdqTa1tOtANczHhi0OSd2g8U.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-vBYbKYolFcq1YxMVwYMpoyq6GUF-Mpp5YHaTXHoCBi4.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-zGdc_OoTGwzCFex8ogRYzTVeVJXc_zQ_diTDXg2PnMQ.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3384055962582005227.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-RX97iu5urtuPNJDiJKUPWlqgp7pb237fCW66EYq4Tgg.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-ysrc6WY80cIE39dM1MdKq6ClwkTBImIhUNl2mTbs1n4.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-Id36CA1mPC0LHQCzFH4K8s0zt95TWTcAK33uFYL0MMY.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-I7VgzsDyLXpsW1IoBglsQV7peS0yMBqsXLUrNYyzdAo.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-_nE6vVkOhSKbDuJ1Gx0eZEG5ydSlpq8wdfPsp3tSVrA.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-jNZyFWoDmaPsoq2PGaCTj5WShwpC-FA7lx0wLSioQlM.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-OM2Zg03xWg_aU3-D8blUgJZkuXimcr4t4tZduGLSyrc.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-gAGdM9qTvnOyzs1hRJUOclmtGpr0crnp49lOTB6uxfc.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-eRzhW0_wjzMyOheX7PK9v8Kjr4tNtc5VCZGwYvyaTyE.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-oA5zeblic9Owb3Jnnb_JIiiDUHkUb9qqooxjijSTRNg.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-jYEyoI0fpp-s6Z5A7857eI8IkqKAC570gvRxZb373So.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-rEpBfo4k90B5DPdJSxHIHE-_00KmWX8ZqMLSzXqrMNo.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-Gtg4Lu4_i7_D-LlV9yKT1ymCbfmH_3OIDS3e9D_0R0g.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-nT2ee2jiqI8BszraiJSYh2BoTk90D8814Gc0j8qpCvk.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-mL0h0qhs1VsvP4zDZXKlGBx998y07rkJzEZcC0lOH_I.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-U4_qVfy-Ofc_cYjWpA58xOoF3M8OoJCV8m--JHLfMMk.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-9vAPiaaL997uPRJGvMK48ESJ-BlAqIQV36vEDH6h4Zk.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-I7VgzsDyLXpsW1IoBglsQV7peS0yMBqsXLUrNYyzdAo.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-X1QTnrEg0mNUYlkqTODQrhkVJsTC8C5uaZ7krBkQBnk.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-hZTmKWsXx5tB67HRGtL4BDXarl9YtDxqnC5ikB3jfQM.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-_QXR8vKW6-O0XtjsVA8BhhLAmqFSaFSkB297klVqE8g.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-5dt0cx263kfYbgIbuNU1PRVsvqkmHdoSririLNqhpBg.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-p65iuUGrfPU4q-028-J5LNe2ex_txNAqQQXaSJJU6nk.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-C0fZc0Xv4dAlnCjlhxThy24pVy_sXgmFRJwL1sArmF8.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-ix6rcsjMkQvqH7LCZKvvbZj8pF_ZhjMiacBVPVuul9g.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-r8Bn7ZfkpOA6wNKqcSTPtcJlhwtLIcVYkc57kM97m40.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-M_pjHDaGTnXHv7dRqI7Hol0p7dD7nqlGGe4tYOiqH5s.jar
    Sep 15, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-F_4EQ1ti7iZ-E4Njo9_S6S3XB59gJvAgPZETTeXfNJo.jar
    Sep 15, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 15, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 15, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 15, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 15, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 15, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 15, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95445 bytes, hash 3414cf9cb77a590bcdbab65cd6228dcf31337cbad2f07d46d4118a7a4ab23f58> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-NBTPnLd6WQvNurZc1iKNzzEzfLrS8H1G1BGKekqyP1g.pb
    Sep 15, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 15, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-15_05_45_20-16360112088404731813?project=apache-beam-testing
    Sep 15, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-15_05_45_20-16360112088404731813
    Sep 15, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-15_05_45_20-16360112088404731813
    Sep 15, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-15T12:45:20.883Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 15, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:45:29.780Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 15, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:45:31.162Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 15, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:45:31.249Z: Expanding GroupByKey operations into optimizable parts.
    Sep 15, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:45:31.288Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 15, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:45:31.384Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 15, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:45:31.427Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 15, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:45:31.467Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 15, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:45:31.506Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 15, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:45:32.099Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 15, 2020 12:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:45:32.230Z: Starting 5 workers in us-central1-a...
    Sep 15, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-15T12:45:36.618Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 15, 2020 12:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:45:58.801Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 15, 2020 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:46:18.674Z: Workers have started successfully.
    Sep 15, 2020 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:46:18.736Z: Workers have started successfully.
    Sep 15, 2020 12:47:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:46:55.652Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 15, 2020 12:47:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:46:55.864Z: Cleaning up.
    Sep 15, 2020 12:47:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:46:55.969Z: Stopping worker pool...
    Sep 15, 2020 12:47:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:47:47.376Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 15, 2020 12:47:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T12:47:47.436Z: Worker pool stopped.
    Sep 15, 2020 12:47:56 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-15_05_45_20-16360112088404731813 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 47795574-24bd-447a-8a7a-5df917ea7a7b and timestamp: 2020-09-15T12:47:56.340000000Z:
                     Metric:                    Value:
                   read_time                    17.699
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 15, 2020 12:47:56 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 49.653 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 39s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/ppqho36if5enq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #997

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/997/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10781] Add


------------------------------------------
[...truncated 277.42 KB...]
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 15, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 15, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 15, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 15, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 15, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 15, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 15, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 15, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 15, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 15, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-yWdefyEFq-9-2lhkV8k_vvEYer6Zb0TTxQ_RyQPVXVo.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-WBWTMtOFu75X3HZ2ejxJ-Akn2dimPy1v31QZkoXAk14.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-RgaDLLR1btZFZhcFI5b4pgpV_6qPLbsSvUm2or7N2E0.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-AjntAQc45I49bOMxBY0QGXz5hXpdbOb0QNAaghiL9jU.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-ye8uB7tMxqbt16kmtZDwdtTsi69iOXbOegZ1WsNQRdI.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-q5bIb2bTTyEmCmVcRhGvKe4SabOGXdU721UnRmfLJSM.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-yWdefyEFq-9-2lhkV8k_vvEYer6Zb0TTxQ_RyQPVXVo.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-xCU3VkU2o5HA5IR40MPV5qh7cYGxWFcC51ZZ39XZd5E.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-vbh2njJAv2JayAJMBmPFFMqFORaDHtYSEk3Me2byHBo.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-wVG24QdiMfKOxIyLmq38iCuaQipf9TzCjlDWV-swjkA.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-LK0mt87eDaDI9xk2f7Zfa8g3faao_AQgbaPj8zjWnfs.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-M1Xlgmj4fYaDw4vf5eW8TvvQXedy9vRJdhMSgk_HK-I.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-wF1he_YGzIdXcOn5wnNusVMIOyKr77MO9QE3mkxqYXA.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-EA6Bx2Wm9qHzYtvFyP0FO0l6WKJMwlhw18dD-KJGrio.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT--_dhmW25gaZ-sn9N3eyAWmRYF_uWp_iu5a0va4ke3_M.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-uFh5njE1uvatDRZAAOEUUyoogVW6YeajZay1fmt5ffk.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test938548984134394369.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-njmWkOnvRTbGMWsv_riMw10P9kC5Kcp0CCNSjTQldVc.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-zU_Nte5CmvaGnyK3EOj9-Ev5-boPwTGQODKpzcHkiAM.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-vAIh5uhem8tlkKUNAurxGaW--8SBJB1NqHZ1RETeANE.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-3eyP4uvZXem_quC_eHGscqiSGEE49FZ_SAl1-ozulDA.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-lH2lK4b7cIKtYY4li-Mh99XRN_VtpQtjKBBdMkF7diU.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-Z7FPeVS1hrLopYsjuCkvP26ft0TeCDP2_eRoEv-C1UA.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-VJUYsmhTCaQHyLW6GhmoqakNN7CBTrGQB-TMmxOEu5U.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-CbDekhj_fiF8MiKqFme7oSq3sHrs5DvRd6RFduYgW6Q.jar
    Sep 15, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-DWOwOxlCLA8uVOImAgo31751fUAn74F-vqrrW3eV45s.jar
    Sep 15, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-qCuE4txoHbgy3jys1QTh2gYIm7Gu66I18w3iHDAq8iM.jar
    Sep 15, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-GK9DLiMOvkLmalYkImnFoHPx_JsD45-rgSIAXq314FQ.jar
    Sep 15, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-s3utP4S3dlHwqdgTxAXfkmy_L_0TYGaKiOBhs6VsN_Y.jar
    Sep 15, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 192 files cached, 27 files newly uploaded in 0 seconds
    Sep 15, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 15, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 15, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 15, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 15, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 15, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95444 bytes, hash 181f6574771acb7090353458b9ca45d17d37ce3e6816dff8f9710055d09866b2> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-GB9ldHcay3CQNTRYucpF0X03zj5oFt_4-XEAVdCYZrI.pb
    Sep 15, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 15, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-14_23_45_18-8199746134253764270?project=apache-beam-testing
    Sep 15, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-14_23_45_18-8199746134253764270
    Sep 15, 2020 6:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-14_23_45_18-8199746134253764270
    Sep 15, 2020 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-15T06:45:18.923Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
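
    (For reference, a minimal sketch of the Dataflow worker-pool options behind the warning above; the setter names are the standard DataflowPipelineOptions API, and the value 5 simply mirrors this job's worker count. With autoscaling set to NONE, numWorkers fixes the pool size and maxNumWorkers is ignored, which is exactly what the service is pointing out.)

        import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
        import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
        import org.apache.beam.sdk.options.PipelineOptionsFactory;

        public class WorkerPoolOptionsSketch {
          public static void main(String[] args) {
            DataflowPipelineOptions options =
                PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
            // Autoscaling disabled: the service keeps exactly numWorkers workers.
            options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE);
            options.setNumWorkers(5);
            // maxNumWorkers only matters with THROUGHPUT_BASED autoscaling.
            // options.setMaxNumWorkers(5);
          }
        }
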
    Sep 15, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:45:25.833Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 15, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:45:26.579Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 15, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:45:26.620Z: Expanding GroupByKey operations into optimizable parts.
    Sep 15, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:45:26.725Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 15, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:45:26.898Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 15, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:45:26.932Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 15, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:45:26.969Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 15, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:45:27.004Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 15, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:45:27.461Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 15, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:45:27.544Z: Starting 5 workers in us-central1-b...
    Sep 15, 2020 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:45:54.911Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 15, 2020 6:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-15T06:46:01.356Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 15, 2020 6:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:46:15.723Z: Workers have started successfully.
    Sep 15, 2020 6:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:46:15.754Z: Workers have started successfully.
    Sep 15, 2020 6:46:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:46:47.551Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 15, 2020 6:46:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:46:47.684Z: Cleaning up.
    Sep 15, 2020 6:46:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:46:47.757Z: Stopping worker pool...
    Sep 15, 2020 6:47:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:47:39.634Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 15, 2020 6:47:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T06:47:39.671Z: Worker pool stopped.
    Sep 15, 2020 6:47:47 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-14_23_45_18-8199746134253764270 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ba675ed6-d009-4d58-8184-4c700504327f and timestamp: 2020-09-15T06:47:47.205000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    10.039

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 15, 2020 6:47:47 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 42.744 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 30s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/dvieocot2yniq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #996

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/996/display/redirect?page=changes>

Changes:

[Boyuan Zhang] [BEAM-10505][BEAM-10530] Add truncate capability.

[sychen] Add max buffering duration to GroupIntoBatches

[noreply] [BEAM-10252] Add null check on logical type override (#12831)

[noreply] Updates Dataflow containers used by unreleased SDKs. (#12833)

[noreply] [BEAM-9615] Add initial schema proto documentation. (#12553)


------------------------------------------
[...truncated 287.66 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 15, 2020 12:48:53 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 15, 2020 12:48:53 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 12:48:53 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 15, 2020 12:48:53 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 15, 2020 12:48:53 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 12:48:53 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 15, 2020 12:48:54 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
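
    (For reference on the failure above: the ParDo in the test emits Beam Rows, and a Row PCollection needs an explicit schema or coder before the pipeline is finalized. A minimal, self-contained sketch of the fix the error message suggests is below; the two-field schema, the field values, and the pass-through DoFn are illustrative assumptions, not the IT's actual code.)

        import org.apache.beam.sdk.Pipeline;
        import org.apache.beam.sdk.schemas.Schema;
        import org.apache.beam.sdk.transforms.Create;
        import org.apache.beam.sdk.transforms.DoFn;
        import org.apache.beam.sdk.transforms.ParDo;
        import org.apache.beam.sdk.values.PCollection;
        import org.apache.beam.sdk.values.Row;

        public class RowSchemaSketch {
          public static void main(String[] args) {
            Pipeline p = Pipeline.create();
            Schema schema =
                Schema.builder().addStringField("author").addInt64Field("score").build();
            Row row = Row.withSchema(schema).addValues("someone", 3L).build();

            PCollection<Row> rows = p.apply(Create.of(row).withRowSchema(schema));

            // A pass-through ParDo emitting Rows, analogous to ParDo(RowMonitor) above.
            rows.apply(
                    ParDo.of(
                        new DoFn<Row, Row>() {
                          @ProcessElement
                          public void processElement(@Element Row r, OutputReceiver<Row> out) {
                            out.output(r);
                          }
                        }))
                // Without setRowSchema (or an explicit setCoder), coder inference fails
                // for the Row output and pipeline construction throws the
                // IllegalStateException shown in the log above.
                .setRowSchema(schema);

            p.run().waitUntilFinish();
          }
        }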

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 15, 2020 12:48:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2020 12:48:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 12:48:54 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 15, 2020 12:48:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2020 12:48:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2020 12:48:54 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 15, 2020 12:48:55 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 15, 2020 12:48:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
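
    (For reference, the pushed-down read above is roughly what a direct use of BigQueryIO's Storage Read API would do; a minimal sketch, assuming a hypothetical table reference, with the projected columns and the pushed-down predicate taken from the log lines above.)

        import java.util.Arrays;
        import org.apache.beam.sdk.Pipeline;
        import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
        import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;

        public class DirectReadSketch {
          public static void main(String[] args) {
            Pipeline p = Pipeline.create();
            p.apply(
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full") // hypothetical table
                    .withMethod(TypedRead.Method.DIRECT_READ)
                    // Project push-down: only the columns used by the query are read.
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // Filter push-down: the supported predicate becomes a row restriction.
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
            p.run().waitUntilFinish();
          }
        }
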
    Sep 15, 2020 12:48:57 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 15, 2020 12:49:03 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 15, 2020 12:49:04 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-bpnVUet1L2gi8y3_Ydmhi_7xMzqW74CuDhFJEjpccgc.jar
    Sep 15, 2020 12:49:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-TdwaPIXn7SwH5RqnYQC6wXo9pJcw2NrHHRVH17RmOH8.jar
    Sep 15, 2020 12:49:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-6cTrVIpU9Wq-CtHTPJC1E5XZePvzVWCriUzYoy4PJk8.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-GBZEkGS1JZoH4PsRV4WiM343J3CMStXhqa_Rcr8sGVU.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-fRrEJOk1YO5QV2MXdRBlJ1LhlwxEJFbLDMeK77vcOOM.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-iAnCK8KCE204TLvrv7KfEd-9ZlTF70gH-ia2dOzeRbg.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-PAJtHnKALAjmgkVWDuDFrcA4i_uKuf-ZAnVP6EexOOk.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test614034748031506813.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-DvynTGUUy4AxAeI2Klx07IAJGHdbslHqjsOQeQ8kL-M.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-ddyTG-hsKIapOx6YESByMd0e1WaWHf8GFZr08KqG7HY.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-CHlu8ux0XcJz6rUkNQMclTP1uCBZyxw3jSZUSIVgQY8.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-5oDDjKrRd963w2-ONnGSPkq9HCWsxMvwlvCHwU2QtHA.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-h13F9d2yuRweXOfz4fsu18Flaq42ELztGFiWYM6o6IY.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-KJy7QiSYQ6vr2mj_FdJKKRQWOyWBimrJHm6_XI2YJPs.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-bpnVUet1L2gi8y3_Ydmhi_7xMzqW74CuDhFJEjpccgc.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-bi2BMEIj7PFXRJjIcKg4LZrDdEt_-l1d22oyCPGHjzc.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-2Z-UCSTeqUF94QQQWu2afu8ywoOv2FijecPtnX-LNkw.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-cIxsUhg_hpbWGuTX5Wn_i9olLHLDlMCRG5lsT2uXboo.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-I1VK8wyM4pZCTHxcqKjlfIP1_d-DTF8nS54C8HVrqCY.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-Vvi0Xci_EyQiBsuuvLduezBHmS2tQhvR7ARJMVO4SpY.jar
    Sep 15, 2020 12:49:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-X_8M_xUvEumode6m6gLbERaMxghvlQ6W40549byZ9zY.jar
    Sep 15, 2020 12:49:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-PFsr9f1-wh8OsCbIn65Yn2I4rJIPbj6-d6IoXMV2UKA.jar
    Sep 15, 2020 12:49:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-Dcg9MEyeaPYGYEu8ir_ao3mMj7t07PFITGH0YBQnlVM.jar
    Sep 15, 2020 12:49:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-0KPLn4w3noOcD03rNEVNkrAjN5DisZ9CShdbtQh9KmI.jar
    Sep 15, 2020 12:49:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-BE2NH_nwq6CaQpVp4wOj_Ig3_Li4SLgPVf2hbi9xklk.jar
    Sep 15, 2020 12:49:09 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-etoX5NfcbgP1WCyHbjnGGxa_66K1_XnUQ1c1zzx8Qxw.jar
    Sep 15, 2020 12:49:10 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-0sh0Ps_XRGaFSZgTcKnY6_VDsh1FGGvQnzwauqHwWTM.jar
    Sep 15, 2020 12:49:10 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-q5oL7thRYgSraII0xpTwlPhdZKMllozIHQo_3K34zto.jar
    Sep 15, 2020 12:49:10 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests--JXO7zLY3l_CT_T44L69NcmWNdZkojgGug21TU6YT9M.jar
    Sep 15, 2020 12:49:10 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-PytO0hTU07CIfKpcEJKkdjyw8ZXDRVsoGaFe8kL8mqA.jar
    Sep 15, 2020 12:49:10 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-nl5a7ZCZXdy4dOeXceoz-qTguAVji7AoVqLeLrztZ-Y.jar
    Sep 15, 2020 12:49:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-59O6PWlce2SSOR0AJPHCcH_wrY_u9_FAs-NSGfZk57c.jar
    Sep 15, 2020 12:49:11 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 7 seconds
    Sep 15, 2020 12:49:11 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 15, 2020 12:49:11 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 15, 2020 12:49:11 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 15, 2020 12:49:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 15, 2020 12:49:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 15, 2020 12:49:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95444 bytes, hash 8acc55e6887ade0403ec5aa066b4160f2872a3fd87b85319cfbecba32bbe5448> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-isxV5oh63gQD7FqgZrQWDyhyo_2HuFMZz77Loyu-VEg.pb
    Sep 15, 2020 12:49:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 15, 2020 12:49:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-14_17_49_12-11314874561782367437?project=apache-beam-testing
    Sep 15, 2020 12:49:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-14_17_49_12-11314874561782367437
    Sep 15, 2020 12:49:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-14_17_49_12-11314874561782367437
    Sep 15, 2020 12:49:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-15T00:49:12.938Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 15, 2020 12:49:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:49:22.937Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 15, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:49:25.109Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 15, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:49:25.163Z: Expanding GroupByKey operations into optimizable parts.
    Sep 15, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:49:25.191Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 15, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:49:25.265Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 15, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:49:25.294Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 15, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:49:25.326Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 15, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:49:25.363Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 15, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:49:25.732Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 15, 2020 12:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:49:25.798Z: Starting 5 workers in us-central1-b...
    Sep 15, 2020 12:49:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-15T00:49:38.701Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 15, 2020 12:49:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:49:54.326Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 15, 2020 12:49:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:49:54.403Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 15, 2020 12:50:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:49:59.853Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 15, 2020 12:50:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:50:17.008Z: Workers have started successfully.
    Sep 15, 2020 12:50:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:50:17.042Z: Workers have started successfully.
    Sep 15, 2020 12:50:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:50:50.312Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 15, 2020 12:50:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:50:50.458Z: Cleaning up.
    Sep 15, 2020 12:50:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:50:50.552Z: Stopping worker pool...
    Sep 15, 2020 12:51:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:51:40.809Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 15, 2020 12:51:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-15T00:51:40.865Z: Worker pool stopped.
    Sep 15, 2020 12:51:49 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-14_17_49_12-11314874561782367437 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 25c1ffd3-4970-4234-98c9-34f63866e5bc and timestamp: 2020-09-15T00:51:49.186000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    16.742

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 15, 2020 12:51:49 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.058 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.081 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 3 mins 24.535 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 5s
107 actionable tasks: 74 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/oydc4ulyp2tci

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #995

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/995/display/redirect?page=changes>

Changes:

[qinyeli] Removing dead code from Interactive Beam.

[pulasthi911] adding twister2 quickstart

[pulasthi911] adding twister2 documentation

[noreply] [BEAM-10463] Fix minor typos

[nosacky] [BEAM-10886] Also publish build scans from Github actions.


------------------------------------------
[...truncated 279.46 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 14, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 14, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 14, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 14, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 14, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 14, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 14, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 14, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 14, 2020 6:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 14, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Gff2SfaJfWyqI4FCc1aunl2cTpu4PI1VslO1VrjsFws.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-fd3evTY5oifDUVeSaOlV4BL8IYP0uu5taEDpNNKdB3A.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-9E_vyQI09fTU0d29wUoONDyKGrbEiXoavf7aRx9zF5k.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-K0wqBGcyY46bBSva-wHL5iQjYCHBgDKqIJ9qTaJhtQo.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-EZKV-byY2feyfSg_4yCb6xYRuyGENWo1kYFXMLH8VSg.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4445160172037073909.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ir-EVZavJqUEqsTD02seIpBtZ0x3AYYu5LFYCppx9l4.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-RjHmqByzOKXc9djNjBhIgxGI24TvDqf43PQIEU4eJuk.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-aj7MKwWJU-IASe47AxUgfjPGv0PvpzfmrV51kzvZKUo.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-jWnsEvZzhpFlZ6LgTlmyWLrgvgKot1TnxWvHMkLjlK8.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT--3KFBXiX-JsrokjCauimoB893cO4_UdKP8hoPKiWXOc.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-5c2oB6MdnRcr-IRjs3UOiTDiQmBnXrfq1XOwthI6aHM.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-tvekpeFJPJy4ScFY99Qh5r-qMgJGmSSv0ayKaBmGG88.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-W42A8nyb5YbfGGL-JjU0pNWtAqdOa0y83udcQhRrqMo.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-aq7LqsFmHj851jLYZw46oUsSYVEvuU_0-HZVP_TzqSg.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-rZNdxYu4Su1qUEJQ9r0b40ptejo5OURgW2T_2rBifm4.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Gff2SfaJfWyqI4FCc1aunl2cTpu4PI1VslO1VrjsFws.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-FATCX34Tdd10TlSsHH2DIfvVsOpPV2DKQLko7TXDLEE.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-j2fHYhOWK1YrtrTEmlYX3McZaQxxWCigb4XaIrEgp3g.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-kdj-1twrmLgUSU8X1DoGne6oTLs9alRnE31wPZsLp8Y.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-xBcW7Kya05FCf7WAWNOZUhURhiFKccpXTiALrT0OgaE.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-JxLiB7qB4WgL3pO5RyC-_5BTo88b00RwyKQi3HLgSjE.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-pv-G1BjhO8IV5fNm962uYRRM3SZmaDacJC5qyOcbHMY.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-zXYWJHvJW2ecotWWHqLPrWGtQ0J5jrlea2qyeI3_Pxs.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-rJLQ5NQB1HuBg-4Qzqm2uAn_mCUlmXCAsuYCoz40cjc.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests--FXOqtJfto7tC0TKRBdVq7CCZGh7qg_zfj8RgrOkQY4.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-AqQNdsIqC96STfaD_nEuUU5VkcqEE3xDZaxLofbEBJ8.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-DDH5wVsG9_UryBaH5Yew1GkymzfRqUe2NK0FML-PLOo.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-HoX1xb0jPRBW9Ns2pGsAgrgPxDJ1L1bXhce2p3J2mjY.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-Ryb25v2NccvnnjeKC6AxSQXiijU8y5D86XwkRVBDPdE.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-T_KQdeASX1LVkikktpO-LJ9IdxWy0U2YVBaaFUsNcoU.jar
    Sep 14, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-TwxNPXNqHPDjESCHAhKiIs5N6w9AumHGLQJ79nEN-W8.jar
    Sep 14, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 14, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 14, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 14, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 14, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 14, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 14, 2020 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95394 bytes, hash f9fb95a53c419314816e297f8d763209f4177e19750e7d37e0b25a59611d9bcd> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline--fuVpTxBkxSBbil_jXYyCfQXfhl1Dn034LJaWWEdm80.pb
    Sep 14, 2020 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 14, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-14_11_45_23-16211709926831709665?project=apache-beam-testing
    Sep 14, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-14_11_45_23-16211709926831709665
    Sep 14, 2020 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-14_11_45_23-16211709926831709665
    Sep 14, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-14T18:45:23.888Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
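
The warning above is expected when a fixed-size pool is requested: with autoscalingAlgorithm=NONE the service sizes the pool from numWorkers and ignores maxNumWorkers. A minimal sketch of the relevant options, using the standard DataflowPipelineOptions setters rather than the exact flags this Jenkins job passes:

    // Sketch: asking for a fixed pool of 5 workers, so maxNumWorkers is not needed at all.
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class FixedPoolOptionsSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE); // autoscaling off
        options.setNumWorkers(5);                                       // fixed pool size
        System.out.println("numWorkers=" + options.getNumWorkers());
      }
    }
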
    Sep 14, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:45:32.281Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 14, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:45:33.263Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 14, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:45:33.316Z: Expanding GroupByKey operations into optimizable parts.
    Sep 14, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:45:33.356Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 14, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:45:33.425Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 14, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:45:33.453Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 14, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:45:33.483Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 14, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:45:33.520Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 14, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:45:33.952Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 14, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:45:34.018Z: Starting 5 workers in us-central1-a...
    Sep 14, 2020 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-14T18:45:48.658Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 14, 2020 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:46:09.919Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 14, 2020 6:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:46:17.741Z: Workers have started successfully.
    Sep 14, 2020 6:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:46:17.773Z: Workers have started successfully.
    Sep 14, 2020 6:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:46:57.387Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 14, 2020 6:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:46:57.555Z: Cleaning up.
    Sep 14, 2020 6:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:46:57.648Z: Stopping worker pool...
    Sep 14, 2020 6:48:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:48:05.733Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 14, 2020 6:48:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T18:48:05.777Z: Worker pool stopped.
    Sep 14, 2020 6:48:14 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-14_11_45_23-16211709926831709665 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 34ff9828-36d8-4bbe-b18e-ab6130313b83 and timestamp: 2020-09-14T18:48:14.218000000Z:
                     Metric:                    Value:
                   read_time                    21.794
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 14, 2020 6:48:14 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 3 mins 5.783 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 58s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/bcvccvp6tbbiy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #994

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/994/display/redirect>

Changes:


------------------------------------------
[...truncated 281.23 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 14, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 14, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 14, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 14, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 12:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 14, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
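
This is the failure the error text itself explains: the ParDo(RowMonitor) output is a PCollection of Beam Rows, and without an explicit schema no coder can be inferred for it. A minimal sketch of the suggested fix, setRowSchema, using a throwaway two-column schema and a MapElements stand-in rather than the test's own transform:

    // Sketch: giving a PCollection<Row> an explicit schema so a RowCoder can be used.
    // Schema, field names and values here are placeholders for illustration.
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TypeDescriptors;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        Schema schema =
            Schema.builder().addStringField("author").addInt32Field("score").build();
        PCollection<Row> rows =
            p.apply(Create.of(1, 2, 3))
                .apply(MapElements.into(TypeDescriptors.rows())
                    .via((Integer i) -> Row.withSchema(schema).addValues("user" + i, i).build()))
                // Without the next line, coder inference fails with the same
                // "Unable to return a default Coder ... setRowSchema" error as above.
                .setRowSchema(schema);
        p.run().waitUntilFinish();
      }
    }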

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 14, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
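
The query above is ordinary Beam SQL, so the same statement can also be issued with SqlTransform against any schema-aware PCollection registered under the table name. A minimal sketch with an in-memory stand-in for HACKER_NEWS (the stand-in schema, sample row and class name are assumptions; the real test reads the BigQuery-backed table instead):

    // Sketch: running the query over a tiny in-memory HACKER_NEWS table via SqlTransform.
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionTuple;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TupleTag;

    public class HackerNewsSqlSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        Schema schema = Schema.builder()
            .addStringField("by").addStringField("type")
            .addStringField("title").addInt32Field("score").build();
        PCollection<Row> hackerNews = p.apply(
            Create.of(Row.withSchema(schema).addValues("alice", "story", "hello", 5).build())
                .withRowSchema(schema));
        PCollection<Row> filtered = PCollectionTuple.of(new TupleTag<>("HACKER_NEWS"), hackerNews)
            .apply(SqlTransform.query(
                "SELECT `by` AS author, `type`, `title`, `score` FROM HACKER_NEWS "
                    + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));
        p.run().waitUntilFinish();
      }
    }
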
    Sep 14, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 14, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 14, 2020 12:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 14, 2020 12:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-CxpnhIkJSbOTB8vk8xJOsRwXSgpSu8W7LbIEK4toCTA.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-DzgoqJHqh4GW59AsaVySJtXehj2A6CjeVkFgrPQsL2g.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-39Wa-CWCsU2du0C5-vLVUT3rUKfZ-rlv21S9v3WO-m8.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-sLfGWJzdJEMnknSnoYzAlub7Nj3OTXXVCs08rBksy30.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-azRmsRR6AdTU3ho54SvOrInjRv_HSP7SmkYU1ZxCIZQ.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-DbD-CQ_ZZPY74RHvm3wOXAezEvjfBVAgFodU-sI6l_E.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-llJggq3rB-e2NSRVq6h7Ry7ToR8fs_Zjolb-8N_aADU.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-kuEBW3uBDIxv6KFmSbqje_2-FWKFUTM5fB5HA3myAOI.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-CxpnhIkJSbOTB8vk8xJOsRwXSgpSu8W7LbIEK4toCTA.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-Uf6vrC7ncpZ_2q9qQyRP0tKYSYvxqr6sA6YG1jNB18I.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-Iednw9vCH3TnIoYtJkUJ1Uo8fWUddjydJsr_MIs_3AM.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-h0pvrK-Aw8orC5WlxViOnR-xpLZcGsNERCveOOBY9K8.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8990540214586028926.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-LzAsT6I3jMDFJo-s9GwEgpsDtJRUOd-FIaa2x9mchgA.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT--o18nBfiOehUHVHwcS-tGWiSus-xDKR-MFwnYBxLmPY.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-vQXf0kFtaXlzvubwa6SKHpEsshOiOHe7C613SCt_BTM.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-G63_nmdO8Dm5No1TmjHwki-0-dsiUvhG2YE2GV1mUVM.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-FtNOZmGDuCtTMlw1TAZ2gg05GyjWuHIMYltYQuxSREs.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-ndhjz73sDx0-_-63Dtg8BqIA9Sw1bCQWzeB2aJYmp-E.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-4DFWgJMaOAZz8QU7aMS9I0eKrWB8WNtOYsKfCWRCLs0.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-u1y-L1iWZ3Zf-lUg7zx2HOEDizUPBWmOqg6p0T4gdP0.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-qUDJkWXrwbtn_nSfMipRq9ySWbuaNIVKCAwr53YuMyQ.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-xSBDtfxYCJ4vQhwnG12lbwluBmeT__2ZiOm32cCFmcY.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-vcBy8cQP-RpphCAK6y9Dti1btBPCegXw3v9DsYgbVKw.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-PuGMbzem6fWB_X7iCsEodbK1RqaTjqBGyPFjMsNWaYE.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-43n1xsw_9j1-J04CQt2JsMPs6qpqDUl6MZvbKsIimq8.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-2P3eAK4LiRpKuWR_XMbVhpYrlI7Vk7tkicwXhw_OOjU.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-bN6OY1lgBogbtXYcI0G3Xad_G_ndVm56msDMONKeEzg.jar
    Sep 14, 2020 12:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-lj8F-Vc2me6g6BR0iPYqDUEPSwMbmCXTmw50Y_1g64U.jar
    Sep 14, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-UpXRq82RJqjZCxQeGgShVLT4TOWVt4Dmk9b5db9KCos.jar
    Sep 14, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-Z75leKgHkjdS7V1WuKQuaUKIbxxF5Uqko3WEEkvyf-U.jar
    Sep 14, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-vyNJxwxjFRw7bBPlVpq440HMe6mbxMP4Rxm5ep7id9s.jar
    Sep 14, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 14, 2020 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 14, 2020 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 14, 2020 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 14, 2020 12:45:37 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 14, 2020 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 14, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95394 bytes, hash 1bea30a6268920cbf9a870e547b547b77182c23db4665c12e049b8b02238bf1f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-G-owpiaJIMv5qHDlR7VHt3GCwj20ZlwS4Em4sCI4vx8.pb
    Sep 14, 2020 12:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 14, 2020 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-14_05_45_38-2637743576305468908?project=apache-beam-testing
    Sep 14, 2020 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-14_05_45_38-2637743576305468908
    Sep 14, 2020 12:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-14_05_45_38-2637743576305468908
    Sep 14, 2020 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-14T12:45:38.131Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 14, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:45:44.637Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 14, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:45:45.405Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 14, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:45:45.444Z: Expanding GroupByKey operations into optimizable parts.
    Sep 14, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:45:45.483Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 14, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:45:45.561Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 14, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:45:45.587Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 14, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:45:45.614Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 14, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:45:45.635Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 14, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:45:45.979Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 14, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:45:46.058Z: Starting 5 workers in us-central1-f...
    Sep 14, 2020 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-14T12:46:11.460Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 14, 2020 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:46:11.862Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 14, 2020 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:46:11.889Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 14, 2020 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:46:17.282Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 14, 2020 12:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:46:33.064Z: Workers have started successfully.
    Sep 14, 2020 12:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:46:33.102Z: Workers have started successfully.
    Sep 14, 2020 12:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:47:05.621Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 14, 2020 12:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:47:05.759Z: Cleaning up.
    Sep 14, 2020 12:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:47:05.847Z: Stopping worker pool...
    Sep 14, 2020 12:47:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:47:48.217Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 14, 2020 12:47:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T12:47:48.268Z: Worker pool stopped.
    Sep 14, 2020 12:47:55 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-14_05_45_38-2637743576305468908 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 18f7b8d1-72ce-4bdc-8ff3-7d1cdb109550 and timestamp: 2020-09-14T12:47:55.482000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.372

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 14, 2020 12:47:55 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 34.229 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 36s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/miln4ortzmdoc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #993

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/993/display/redirect>

Changes:


------------------------------------------
[...truncated 279.50 KB...]
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 14, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 14, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 14, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 14, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 14, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 14, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 14, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 14, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])
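
In the BEAMPlan just above, BeamPushDownIOSourceRel records that both the projection (usedFields=[by, type, title, score]) and the filter were pushed into the BigQuery read. At the IO level this corresponds roughly to a Storage Read API source with selected fields and a row restriction; a hedged sketch of that shape, where the public Hacker News table is stand-in data rather than the table this test reads:

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Only the four projected columns are read, and the predicate is evaluated
        // server-side, mirroring usedFields and BigQueryFilter in the plan above.
        PCollection<TableRow> rows = pipeline.apply(
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full")  // illustrative table
                .withMethod(Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction(
                    "(`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        pipeline.run().waitUntilFinish();
      }
    }

With this shape only the needed fields leave BigQuery: the filter logged just below is handed to the Storage API rather than applied inside the Beam pipeline.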

    Sep 14, 2020 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 14, 2020 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-2TiDJRJqpZftx7W4o3uXh3T7LNchuzA0v1XoGGUiM78.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-hXijY5fbGHUzIPuzCtrVwXrlmpRBcOgR1JhqXVeyCWA.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-nKbn_c_c0dx9jPxNJHBcKFCEPeklLSdupPTiPkEM0Uw.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7357824017053594814.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-U2NJ_70DbSif3JGEla02Ye6-Ua-lXoSugNUu2kl9NL8.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-P6yMXXBewN00uFAF86TK2WIVyv-NBkrxXAufpoug3Tg.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-nW-vJAdAuY5gwOb5m0A9jg4WVutC-6-bFlzVVFl7Owk.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-CYEKkTqfjLnJ5LJMkn6ixYqLijv0oQz-Guyl-8nTkfU.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-VZ1PTDAm8CZS3RO9q8ja-K4yhwf9mk-wPdGaj29pLb0.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-HcrTDY02lQ7pGWnUnZTy9qt28frBR1Evop9Fk-sbOzw.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-gCxbzM7UIWmBWbh8DoWZphzglpNQ3Q1y8YDTyxIHr_4.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-j96WeB2aX9urpVdK2rLhTVcLBoyKkDvOoTGiOea22WU.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-12VKpNDhgubLbwr5eoFB-WNco2kBEdC08MeVxr3YqBk.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-g6h7cMG_smhszM39KEZYSzC-kBJtiFweL4D6QLx03zw.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-2TiDJRJqpZftx7W4o3uXh3T7LNchuzA0v1XoGGUiM78.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-m4ErMyNjgI3f9eci8fv9NSOjqG0w5J7NV17rChHivEI.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-oVmfMM3kpJUAgek8fbD35StAMBQXARleDHMqtQj-mNg.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-DNUWShArr-N9yjTruHfqR0BTqxAz6qgIgVBtSjSLm8U.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-fTdaIsGPLtOdNpXs3quJ0UqREGBne_siRRO3SOIkvQw.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-0wKuAM8TPU3Ve29H9125xvS1VfW3Fw8JwjgRIjahlms.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-z450xTt3aAd-l0IWgpMT3Ok3-2EnF5zjxNC_4XX0WPA.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-fOVdCzzyP4KWV2L-rSKbNTP1vSg9DgaWp-jdvTm-_P4.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-ySpJiGnYJLP2CP4EfTEJiQJwQg7CjCXUzHG71Tcbqus.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-hDTtL8yVb2Gt6LlSxqNxZprawGVgGcjqryMHxIL-P0s.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-pnYMXou3b3FFmFvg3R25b4O0ZnKAkzih2JX5pRO6Duc.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-d5pOGQ7BbfbwUi4ZZFBCH7tqQcVdjjPRC0PuuQnUvXI.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-50YPx-z-rZ0L5zUu7RSNllS5pI426Ze7L1D9LfhihzI.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-oZiIbXq5H-sB1WXndDYE1nzysHMX5BjcK4PlNcM63mY.jar
    Sep 14, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-ZhKm-s6R21aEJMvHewBhGvCaLtMAp-2IM1saEl13sU4.jar
    Sep 14, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-DbvvZGMFwkwawilRMGeZbWuLTvx3Fu_rrC49gaXchtM.jar
    Sep 14, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-R2XJ2Ux5Y6pIndbyvR50HCIOXYhU8YE4IZWpi8ui_Xg.jar
    Sep 14, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-J5kMTDV1lR-RsQQZE8MyzO4g8wJkniMWP2cC8KZa25c.jar
    Sep 14, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.alibaba/fastjson/1.2.68/9e3d29f05bcfab1c15a1357ebf2dd513c1d42f49/fastjson-1.2.68.jar to gs://temp-storage-for-perf-tests/loadtests/staging/fastjson-1.2.68-cGrbCezeeBQfDPJGWh6b307ug_n5g8_BYqWhckhy_rs.jar
    Sep 14, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 188 files cached, 31 files newly uploaded in 1 seconds
    Sep 14, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 14, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 14, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 14, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 14, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 14, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95394 bytes, hash 364a3735579105dcfebf2d5b44e665a34aa0492a3e6ad78e08c1b7bfe805bf82> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Nko3NVeRBdz-vy1bROZlo0qgSSo-ateOCMG3v-gFv4I.pb
    Sep 14, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 14, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-13_23_45_17-11896394134789602595?project=apache-beam-testing
    Sep 14, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-13_23_45_17-11896394134789602595
    Sep 14, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-13_23_45_17-11896394134789602595
    Sep 14, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-14T06:45:17.132Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 14, 2020 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:45:24.884Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 14, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:45:25.770Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 14, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:45:25.811Z: Expanding GroupByKey operations into optimizable parts.
    Sep 14, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:45:25.841Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 14, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:45:25.908Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 14, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:45:25.944Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 14, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:45:25.971Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 14, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:45:26.009Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 14, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:45:26.489Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 14, 2020 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:45:26.587Z: Starting 5 workers in us-central1-f...
    Sep 14, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-14T06:45:33.960Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 14, 2020 6:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:45:59.124Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 14, 2020 6:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:46:14.950Z: Workers have started successfully.
    Sep 14, 2020 6:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:46:14.990Z: Workers have started successfully.
    Sep 14, 2020 6:46:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:46:48.302Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 14, 2020 6:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:46:48.426Z: Cleaning up.
    Sep 14, 2020 6:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:46:48.501Z: Stopping worker pool...
    Sep 14, 2020 6:47:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:47:35.627Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 14, 2020 6:47:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T06:47:35.671Z: Worker pool stopped.
    Sep 14, 2020 6:47:43 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-13_23_45_17-11896394134789602595 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 1aa485cf-57b0-4163-8c16-e91f7f519daa and timestamp: 2020-09-14T06:47:43.831000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    13.548

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 14, 2020 6:47:44 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.017 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 40.545 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
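
For example, a local re-run of just this suite with more detail might look like the following (the task path and test class come from this build; any pipeline options the IT requires still have to be supplied separately):

    ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest \
        --tests "org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT" \
        --stacktrace --info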

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 27s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/qjqtdtj2f7uww

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #992

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/992/display/redirect>

Changes:


------------------------------------------
[...truncated 279.38 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 14, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 14, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 14, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 14, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 14, 2020 12:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 14, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 14, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 14, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 14, 2020 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 14, 2020 12:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Z9abmQLtwvYbNOhNIXYBghW-wiBMzrvqUO6VlA0Ftys.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-mGghy3JUURVt7jCOc31uUzyK67zTJ9ejarngJqkoheE.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-ChlpxMrkAx94DNYEGa9mjVwMYCSScYiTfa0vH4Jeh74.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-6cMjb4xH8S7Y7c0m3dfjANR-_OEGKeP_5Zz5te_hZ0Q.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-6NDgkGbsZHEN_Mfaq0Az4Lf0HkrWfm-cCVdFFV5SxBs.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Z9abmQLtwvYbNOhNIXYBghW-wiBMzrvqUO6VlA0Ftys.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-oEMuUZZ7sYEqd0cJCdHBjpk0Z5XgaVpN-jMp2z6zJ0E.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-zTbek3XabzrNRI88ki6k3uah5p8PAgK2RMybkQzYO4c.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-pl57gdIyXPnk9nyxKT1nQTmEYIQRDZQAte-3DrJ_UeQ.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-QuSWX2PtjrDwjZHTH0MnDvLvtP2syBwtQi7bdieJI6M.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-xbKAVgSdWCwZJ6o18e1ptg_8BfD2rk2ZN1OqKL7wMro.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-rGBqcJ80l9wxbIBHqNqbpQR8CJ0VqFCptYKRn41CR8w.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-PSiSUCj757Q3kN97593dEm6Whg2Ol-pnT6FoMzFgwl0.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-hOm3pTfVM2jh1mI0bSy65Y9iKl3dyTg9Tc7ipwg2Xmk.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3906649559536095733.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-P43mHstF0Vn3cOMxP4q8IPMNodbniT8zVHQfbu6cEOI.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-Ze6CBFpEjaqYvLJMhLlOQ7lJbdvxpo4Is8y1dF4IO5I.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-xDIuE4m5k8jS48CHFZETDBCD6Nf0MfwqoWmglXDwvNo.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-aOklLejTZAwn-uulKLnuuUw2Sy9qBf5ci3RyxU4T6fw.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-5XrKoqwKsI_FVhoZuQ_wAL9CRFinPXoRC6r-g03NxZM.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-9mUEZaAXKhwFHrMXEqd4-tLTFSP4s9YsBXpmAnxukz8.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-hCEV9o-NnwGsUOgkYOuQ6roWsEIfaw-bEAcGkPJLXl8.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-cazJrUePQToOjy9u6d633lx67brCGD6CKm3nzpohsOk.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-phpMU5TwkEbn6J0lmHY4pxhdq1rlmvc-BYE7d8vsDO4.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-L_16y7CEte27wAOUGfsHBSUzY1-6zO58nb5ygjiNTsE.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-rdVX59bABzMJvY4BtQ44w6P2el5qNUoNCgpCM9p_ftU.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-Hmc1BTyhXc8sO1W2ILm7Iap-RSnW9feRfAtUzJvbmTA.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-85k9i1FCp7eey_VhWK15oVX_lWtftzEBEQn1zQr8CPI.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-4Dg2bR9-Oz9F2IwHrTjzCpg6iETVfiPxkuaI_lV82nI.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-nYZSkNl_7oDeDt9LR318__Asx7I2d0B2ILh_wuJ4I88.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-8cC5rTF0vkVmGbwfccRV_LWLEuzP_Us7R5VvlXcwHkI.jar
    Sep 14, 2020 12:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-dTKXrzh50rMFiCJWZPSGa82rIj1EF_8rmpCWOT0yseo.jar
    Sep 14, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 14, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 14, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 14, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 14, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 14, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 14, 2020 12:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95396 bytes, hash c727f1f8c66e1a5dd75af46828f28b374733ece84b418e0a9776a8939b947216> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-xyfx-MZuGl3XWvRoKPKLN0cz7OhLQY4Kl3aok5uUchY.pb
    Sep 14, 2020 12:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 14, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-13_17_45_17-16012402092642319127?project=apache-beam-testing
    Sep 14, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-13_17_45_17-16012402092642319127
    Sep 14, 2020 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-13_17_45_17-16012402092642319127
    Sep 14, 2020 12:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-14T00:45:17.703Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 14, 2020 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:45:26.616Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:45:27.812Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:45:27.850Z: Expanding GroupByKey operations into optimizable parts.
    Sep 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:45:27.882Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:45:27.958Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:45:27.986Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:45:28.010Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:45:28.032Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:45:28.402Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 14, 2020 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:45:28.482Z: Starting 5 workers in us-central1-f...
    Sep 14, 2020 12:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:45:56.229Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 14, 2020 12:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-14T00:45:59.815Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 14, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:46:14.143Z: Workers have started successfully.
    Sep 14, 2020 12:46:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:46:14.180Z: Workers have started successfully.
    Sep 14, 2020 12:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:46:45.899Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 14, 2020 12:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:46:46.048Z: Cleaning up.
    Sep 14, 2020 12:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:46:46.123Z: Stopping worker pool...
    Sep 14, 2020 12:47:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:47:40.149Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 14, 2020 12:47:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-14T00:47:40.190Z: Worker pool stopped.
    Sep 14, 2020 12:47:48 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-13_17_45_17-16012402092642319127 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 74a5efb7-ef22-4325-a537-fad5d78d2688 and timestamp: 2020-09-14T00:47:48.145000000Z:
                     Metric:                    Value:
                   read_time                     14.11
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 14, 2020 12:47:48 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 43.721 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 31s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/ad6xihyqivdxc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #991

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/991/display/redirect>

Changes:


------------------------------------------
[...truncated 279.77 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 13, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 13, 2020 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 13, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 13, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 13, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
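
The IllegalStateException above is the generic "no coder" failure for a schema-less Row PCollection: the Row output of ParDo(RowMonitor) apparently has no schema attached, so RowCoder cannot be inferred. Below is a minimal, hypothetical sketch (not the integration test's actual code) of the remedy the message itself suggests, i.e. calling setRowSchema on the Row-producing step's output:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        // Hypothetical schema matching the columns projected by the query in the log.
        final Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());

        PCollection<Row> rows = p
            .apply(Create.of("story", "job", "comment"))
            .apply(ParDo.of(new DoFn<String, Row>() {
              @ProcessElement
              public void processElement(@Element String type, OutputReceiver<Row> out) {
                out.output(Row.withSchema(schema)
                    .addValues("a_user", type, "a title", 3L).build());
              }
            }))
            // A ParDo that emits Row has no inferable coder; attaching the row schema here
            // (or calling .setCoder(RowCoder.of(schema))) avoids the IllegalStateException
            // seen in the log.
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }

Whether such a fix belongs in the test, in BeamSqlRelUtils, or in the RowMonitor helper is not decided by this log; the sketch only illustrates the API the error message points at.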

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 13, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 13, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 13, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 13, 2020 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
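
For comparison, a hypothetical plain-Java restatement (not the test's code) of the predicate that, per the log line above, is pushed into the BigQuery Storage read rather than evaluated in the pipeline:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.Row;

    public class PushedDownPredicateSketch {
      // Equivalent of the pushed-down filter:
      // (`type` = 'story' OR `type` = 'job') AND `score` > 2
      static boolean matches(Row r) {
        String type = r.getString("type");
        Long score = r.getInt64("score");
        return ("story".equals(type) || "job".equals(type)) && score != null && score > 2;
      }

      public static void main(String[] args) {
        // Hypothetical two-column schema covering just the fields the predicate touches.
        Schema schema = Schema.builder()
            .addStringField("type")
            .addInt64Field("score")
            .build();

        Row story = Row.withSchema(schema).addValues("story", 5L).build();
        Row comment = Row.withSchema(schema).addValues("comment", 10L).build();

        System.out.println(matches(story));   // true  -> row would be returned by the read
        System.out.println(matches(comment)); // false -> row is filtered out server-side
      }
    }

With DIRECT_READ plus push-down, rows failing this predicate never leave BigQuery, which is why only usedFields=[by, type, title, score] and the supported filter appear in the BeamPushDownIOSourceRel above.
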
    Sep 13, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT--RpzHxJNG633hQ6lRU8IMc__-srXDPTr6Rxi4jee9h8.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-PklnGxcoosB048fUFrg6Pbf7T7Zg_xxzCCuuvaWvwYg.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-BOzzy5YU46WU9h_aTmD60nE1tXaqhF4hWn7-3CsH0M4.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-CNV8Py6ecNrsZ2WUhbIUR_T-BFSAAyTH1BUqhP5qVL8.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-Eu_puBqV06gF6n2ASMn5wApMPmi_DRH-LWfiyuBJ_5c.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-Ab6Pz8AXfFndmiF3Wf2ZOZt-6_0CfkLJLplPKRfBf1w.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-aFagGdf-53qDJRsJ05-i3RPaqBb3d3kZegSkVLyT3lk.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-Sv5_gYVXrO5MmpwMDIZd1XpwjGWmX-uj-siJzsw74GI.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-dHghPk7Snrfq_KwyLs9rEeqe-rPundgfZ53e2_HV_hY.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-FcosGwIDcMKy0XNH5f-jreMUEIL60-A-W0Bdd_LnoA0.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-cXjTZH3YkaOuY3-N3LVQ81eLTLLoa-O6D5TXISqTGEw.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-cynQExj2tz6Uo2er7EKARxJdUBqOjyuVJUOrF7cuMig.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-32eFoBJF9kHop5CV8z-801haFO4kv9wNwwmyvLFVgfk.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-b7x6rwUslYBlnY37Lh5qFHdeJ_gq320FGlrUnemcVi8.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-JQNVPwDcYCxln-U3v-044_i-1eDquZRl4LI19gZY2EM.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6913594441642094934.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-dDUwAcsRDYpGgw3d0lyNQt2hwea5zLQFCC9PEQlf1DE.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-5VkQ46ypHk9hNLTtPC2wjTzWt8_XmXBUDzPlV6_RGvw.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-su5ZiYSRQzHnXfVbGzPSeZ-QOXT9pj1Ic2h3yAsWYfc.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-lY3kDUWHr27CySwduHqc_PVzo8kq95JykciO6Qa5reo.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-_0krzO8cyf9EKOlA0Mmq3kQreYPa2GefoQKsD13-BVQ.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-HWK4KWxMIxFCn9eaCr7T3bfJM_k82njFVaKumPna4v4.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-CMd6Ksef87T_Q7XCIwZ-oEqQOaZFRL7eNcZyAhtn5eQ.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-ezEk5HRzOOmOavPfXYvA0nXgkAvO3-jKaA8V-nvHzEc.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-7q_bbIubCLRw3F5to4-fk-DhCxNqwYlqBss0tBFmnnc.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-u6qyT123MU1fb91tJ-dKhgka2dMzQqlw1JGRFrKrxH8.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-PiXEQwkXQL7xEDkaXVQXPmakGnwEDnXIF19HG371ptU.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-BN6axB0kPFfH8K8O8G4a0QgxMT7Bx4upohImPis10Qo.jar
    Sep 13, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT--RpzHxJNG633hQ6lRU8IMc__-srXDPTr6Rxi4jee9h8.jar
    Sep 13, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-Tzpv0nKw04enEscH6qF2Gx_76_GAjdgS2MUsWaNXNS4.jar
    Sep 13, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-uA1jtCw2-j-QC_u9KywFWq3da5I6ROmRhgxLJ6QsCdA.jar
    Sep 13, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-AZEDRqMjLGsJadTUox0-ZjmzMxyNcDMgVYyf_NUQkqs.jar
    Sep 13, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 13, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 13, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 13, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 13, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 13, 2020 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 13, 2020 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95394 bytes, hash 665fdec223cbc7d44d8f9714488dbf3afa7e360047bc0ec5c3a710d3b68e5e7f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Zl_ewiPLx9RNj5cUSI2_Ovp-NgBHvA7Fw6cQ07aOXn8.pb
    Sep 13, 2020 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 13, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-13_11_45_21-3384341224961832294?project=apache-beam-testing
    Sep 13, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-13_11_45_21-3384341224961832294
    Sep 13, 2020 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-13_11_45_21-3384341224961832294
    Sep 13, 2020 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-13T18:45:21.190Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 13, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:45:29.584Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 13, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:45:30.202Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 13, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:45:30.266Z: Expanding GroupByKey operations into optimizable parts.
    Sep 13, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:45:30.300Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 13, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:45:30.378Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 13, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:45:30.403Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 13, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:45:30.438Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 13, 2020 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:45:30.480Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:45:30.938Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 13, 2020 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:45:31.018Z: Starting 5 workers in us-central1-a...
    Sep 13, 2020 6:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-13T18:45:39.745Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 13, 2020 6:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:46:25.387Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 13, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:46:44.646Z: Workers have started successfully.
    Sep 13, 2020 6:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:46:44.682Z: Workers have started successfully.
    Sep 13, 2020 6:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:47:23.129Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 13, 2020 6:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:47:23.436Z: Cleaning up.
    Sep 13, 2020 6:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:47:23.609Z: Stopping worker pool...
    Sep 13, 2020 6:48:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:48:16.121Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 13, 2020 6:48:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T18:48:16.191Z: Worker pool stopped.
    Sep 13, 2020 6:48:24 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-13_11_45_21-3384341224961832294 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c31e2bfc-d2b3-42c8-a49f-5029be3e958a and timestamp: 2020-09-13T18:48:24.184000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    18.321

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 13, 2020 6:48:24 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 3 mins 17.509 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 7s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/7ilcfhmp6ibyw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #990

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/990/display/redirect>

Changes:


------------------------------------------
[...truncated 279.60 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 13, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 13, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 13, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 13, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 13, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 13, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 13, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 12:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 13, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 13, 2020 12:45:10 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 13, 2020 12:45:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-ul2meUH0IQCe2yUPARZ6hPcjIVGt2QKcM4xmXmDueEo.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-LC12r3Miotzk70lOxfN0D_fbyreq1hCPmYjAUyycJCA.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-H1Zn7zuNlETPuSOLxqbd4gYYUabEr_BTFI3jtArKaA4.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-GOC3k4S4HIVVvKfFTIYluNZiHwYHFQX2oX2WMoOktcA.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8758513969827789827.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-dZKuIgkJYSQslwOIbML-0jPQ7MZ-K1imX3wpxD-UrPE.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-YnzkouMhxevqpL8dV1KEUgVV-9LIJBWaUyz-w1x8y0E.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-QZkdnvL6pMPy3zZC_umkdrs3zL72idn6iFgx4hgM-Do.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-tmUh938FfAdy-aj9QOt1hd0msh6bN_M64Nu3WH4Kqe8.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-akLsiCKhG4wbqngMrLIqG0SZsUFEimLyaYRiD_g5TIc.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-rRnslF_lQc6J2u3Wg0IIyoUUH4eC9W_ECzkWe9VIe7M.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-pE7LH1z7fkg0mEchFNv7ADFu8aP8GGKK0veqIdV0ARE.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-6zwVuzt92iGA4LPCXj_imSjAu5Z0aT89BwsqXiK7YBY.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-MTV5SSbnsxpVH7VIDhv7qAyagOArS-sdlxMU4yQ9hxs.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-tABhqM6q3BUbrcutC3OPLVLjuPITAbN0IL5woLb0RvA.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-rE3CMaTjBzCrUI4kBi8XxjP91X04ajRn21VzkMKzfCk.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-gWIbaXmJfyJTXeiDIKZdR436w9haA0YUmDxmxCaXMk8.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-kApSY8Xf2qUr_mVEYOjp36W0SP1mDE3alt82yHLxQoE.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-qcfO1vXyxgrNIzVrOsfI-IytaFmnYQqHN6ELka10iog.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-DqdpSmhWXaOncNwcgEtYj2nKkwH9DK_3BW1HzaWpVn0.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-DhSx4ER-r1klHjNsKasC2L3CovV7jeP4MCsyPcVLeSg.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-8MYvL54LO6j90zyIMck7Jr5MkVG_iiMc48zOzoKRedY.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-8MYvL54LO6j90zyIMck7Jr5MkVG_iiMc48zOzoKRedY.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-q7Rf1zbCsh00synwlzjbAeuBCPiaqJF_CpfuzQJBQ-g.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-k4jYjY6VeLyq7-19rk0ALmbaydU7Wl86yWiJ2amnK24.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-hHJ6mSE-jlLXmgL4fjPKeF9qXDh_078PfqXRYRBF5I4.jar
    Sep 13, 2020 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-bC62r1Nvo-ZmE-1LgPWVhiTyIpb9VBmpnqsAz-y8TH4.jar
    Sep 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-BtrLVcbVw089KAbnUrAhl4DzziXzdDaWEGRx-BrpBqM.jar
    Sep 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-hiq_e1FchSEnK3CRRz688NBurhKAR8BRvL0VQPcQYYA.jar
    Sep 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-1H7AWv5ndMOSuSgMGaT7EKNfVasTDQHiwCm2hn-tkwQ.jar
    Sep 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-PwVaHEtLfvJlGtCh8EgvNLjFMVV2Dt0yREt8-r6CIE8.jar
    Sep 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-bHIy6tcmJ50eeeoBxI-11fIy6YsOXXeVS1dVx22-Bp4.jar
    Sep 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 13, 2020 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95394 bytes, hash 6cfd4c81e3c87683f6fa8df17a4bdb9d96b2a88efb77df5842a2605e202b4743> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-bP1MgePIdoP2-o3xekvbnZayqI77d99YQqJgXiArR0M.pb
    Sep 13, 2020 12:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 13, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-13_05_45_14-6935812336081207557?project=apache-beam-testing
    Sep 13, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-13_05_45_14-6935812336081207557
    Sep 13, 2020 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-13_05_45_14-6935812336081207557
    Sep 13, 2020 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-13T12:45:14.327Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 13, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:45:21.309Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 13, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:45:22.033Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 13, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:45:22.065Z: Expanding GroupByKey operations into optimizable parts.
    Sep 13, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:45:22.092Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 13, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:45:22.172Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 13, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:45:22.199Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 13, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:45:22.231Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 13, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:45:22.263Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 13, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:45:22.593Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 13, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:45:22.662Z: Starting 5 workers in us-central1-f...
    Sep 13, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-13T12:45:40.217Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 13, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:45:52.178Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 13, 2020 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:45:52.204Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 13, 2020 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:45:57.643Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 13, 2020 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:46:12.580Z: Workers have started successfully.
    Sep 13, 2020 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:46:12.613Z: Workers have started successfully.
    Sep 13, 2020 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:46:43.660Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 13, 2020 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:46:43.794Z: Cleaning up.
    Sep 13, 2020 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:46:43.877Z: Stopping worker pool...
    Sep 13, 2020 12:47:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:47:34.029Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 13, 2020 12:47:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T12:47:34.082Z: Worker pool stopped.
    Sep 13, 2020 12:47:43 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-13_05_45_14-6935812336081207557 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 527837b7-ad48-45b8-91b0-5136175abd7d and timestamp: 2020-09-13T12:47:43.151000000Z:
                     Metric:                    Value:
                   read_time                    12.906
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 13, 2020 12:47:43 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 2 mins 41.731 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 27s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/sushkpb4mhgdm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #989

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/989/display/redirect>

Changes:


------------------------------------------
[...truncated 279.13 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 13, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 13, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 13, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 13, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 13, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
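
The readUsingDefaultMethod failure above is the standard missing-schema problem: the PCollection feeding ParDo(RowMonitor) carries Row elements but no schema, so no coder can be inferred. A minimal sketch of the two remedies the exception itself names (PCollection.setRowSchema, or an explicit coder via setCoder); the field names and types below are read off the SELECT list in the logged query and are assumptions, not the test's actual schema:

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Schema matching the projected columns of the logged query (field types assumed).
    static final Schema HN_SCHEMA =
        Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

    // "rows" stands in for the PCollection<Row> that had no coder.
    static PCollection<Row> withRowSchema(PCollection<Row> rows) {
      // Remedy 1 (what the message recommends): attach the schema so a Row coder is inferred.
      return rows.setRowSchema(HN_SCHEMA);
      // Remedy 2 would be: rows.setCoder(RowCoder.of(HN_SCHEMA));
    }
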

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 13, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 13, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 13, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 13, 2020 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
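
The plan and the "Pushing down the following filter" line above show both the column projection (usedFields) and the predicate being handed to the BigQuery storage read. Roughly, the generated read corresponds to the hand-written BigQueryIO read sketched below, where "pipeline" stands for the test's Pipeline object; this is an illustration of what push-down buys, not the code the SQL table provider emits, and the table reference is a placeholder:

    import java.util.Arrays;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;

    pipeline.apply(
        "Read HACKER_NEWS with push-down",
        BigQueryIO.readTableRows()
            .from("bigquery-public-data:hacker_news.full") // placeholder table reference
            .withMethod(TypedRead.Method.DIRECT_READ)
            // Projection push-down: only the columns the query uses (usedFields above).
            .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
            // Predicate push-down: the filter logged by BigQueryTable.buildIOReader.
            .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
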
    Sep 13, 2020 6:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-zIV1ItT13P0dLS9ytEImapkMPpr9OUBPOqkpx7a2eP0.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-RW0X05wCqxOYqBX4HKP5HNHVbLGef4e3Fz2Npgh8NKc.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-lQaCV3zUXE9ILN-2VqrHsFj47-zegCG5YShcG92Wplg.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-B9lyWrTPZ4fgHPTj6B_NxhO8VE9mk0s7WSaueS8RLtU.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-0wy5F-Nwxr5Ekm0u9d1iKYS4vsr-hOo-iVR3Ry8NoZc.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-ln8LRBdObK2cYmKGYULZfce4Bl07Rokubu5lbj75Qzk.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5409813205407147310.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-LCx7FKZn8EznAFvuQQbe0NK7ersr0eKz6cXIzmaygpk.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-tHDQaGgDz6XMMKepWmdMGOtRb28LM9FL66waW0bjHOo.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-S4AV1RZD3fUf6EzFIHU-DhWjrkoGa-YV26Rk1CZJACA.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-MBKxaDAaxjOYXbqolb4JvWYkIvOqluiWS_9xbMagIDY.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-THxGf0hcU9WKJebSyDzSLCLA4FfamW479y-j5ddvEpc.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-zIV1ItT13P0dLS9ytEImapkMPpr9OUBPOqkpx7a2eP0.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-afIIvq9JjxkurTfAJsZNhCEG-LQDhnlia41k9mZ_u4A.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-3nrAf-ljc-rQU7m-pndYCR_iyPaDKIvDWAFyAyg2w64.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-tXiipUOYUqJFyQjzFiKTbjNFNaW-w3tOF4byaBE9bFw.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-NskZkMekPglgsDOO0ZpqOHT0vNzRVXmTHvtm6aQG5so.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-Krm9RkdTfraR6NNvJfx9a217BXszqlH5w4aq3l9wG44.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-vMbAs6RvtMCnxOfuqzmo9sPY-sohSNRM8tSWv7_zyYY.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-kb0nlc2IOW72Ejq6pHqzHWHZVq1FiLWS9pwqO9Ww8SY.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-gIu3PxU2NP0fN4HpqSY8YZHClIsX4ADJRJ8w7zoBgJo.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-ZUkscO8PYsR6jfFaWUM0CCCeKGOTnyCWnLRty27U0UE.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-cM5Zu9J6JOYRD-ncyztbRLHunP0LrNMIh-eeH5XSKmA.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-v1qLjiB9b6E3OHkb3NTYwXYVJpREIL-7ksa9-_701r0.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-5veijYU9SXyj_Hc6GwoVCsQwUA1KPIW4cI-KqZQTR4w.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-mATFtbd-r1VcWjhoPESoSgOEVSH6mFMglCuXInDY5DE.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-LDQfxQSwJ5MvLUUi8f30TJWYkWHvEKWwLJAS7-YZSkE.jar
    Sep 13, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-IUz-gmo-5kS2x0dL1lBeCPt0MqLi0P48hxg_dVd69RM.jar
    Sep 13, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-vWTfM6-DXNI9E4BXdV2A78ERUXjgbz9qYok7zAxf8Jo.jar
    Sep 13, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-s8WyqBV2rTfljzyQcdIU9MQPzwpuHROhzADp-4jOe80.jar
    Sep 13, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-0LKQitoIub_1pjxPE2P_RjzeK_HkE3dfHDPfvSbwDOk.jar
    Sep 13, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-7Il7zTQpQvycpdio4NCShPIUO1TJBqGIsxehvv3wksk.jar
    Sep 13, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 0 seconds
    Sep 13, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 13, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 13, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 13, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 13, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 13, 2020 6:45:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95394 bytes, hash f549fa8c906b1f44b49dd31565360e2dde8d1ff5090edcb01e55c17a77053cce> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-9Un6jJBrH0S0ndMVZTYOLd6NH_UJDtywHlXBencFPM4.pb
    Sep 13, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 13, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-12_23_45_18-3668877861938299411?project=apache-beam-testing
    Sep 13, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-12_23_45_18-3668877861938299411
    Sep 13, 2020 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-12_23_45_18-3668877861938299411
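
Besides the gcloud command printed above, the PipelineResult handle returned by pipeline.run() can bound or cancel the same job programmatically; a small sketch (the timeout is arbitrary, and the enclosing method is assumed to declare throws java.io.IOException for cancel()):

    import org.apache.beam.sdk.PipelineResult;
    import org.joda.time.Duration;

    PipelineResult result = pipeline.run();
    // Wait a bounded time; on Dataflow this may return null if the job is still running.
    PipelineResult.State state = result.waitUntilFinish(Duration.standardMinutes(10));
    if (state == null || !state.isTerminal()) {
      result.cancel(); // same effect as the gcloud command in the log
    }
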
    Sep 13, 2020 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-13T06:45:18.238Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 13, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:45:24.567Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 13, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:45:25.399Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 13, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:45:25.437Z: Expanding GroupByKey operations into optimizable parts.
    Sep 13, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:45:25.466Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 13, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:45:25.542Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 13, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:45:25.560Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 13, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:45:25.596Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 13, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:45:25.629Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 13, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:45:26.061Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 13, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:45:26.130Z: Starting 5 workers in us-central1-f...
    Sep 13, 2020 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-13T06:45:50.314Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 13, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:45:52.865Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 13, 2020 6:46:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:46:09.821Z: Workers have started successfully.
    Sep 13, 2020 6:46:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:46:09.856Z: Workers have started successfully.
    Sep 13, 2020 6:47:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:47:44.208Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 13, 2020 6:47:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:47:44.338Z: Cleaning up.
    Sep 13, 2020 6:47:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:47:44.434Z: Stopping worker pool...
    Sep 13, 2020 6:48:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:48:37.225Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 13, 2020 6:48:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T06:48:37.274Z: Worker pool stopped.
    Sep 13, 2020 6:48:46 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-12_23_45_18-3668877861938299411 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 5ed4b5eb-ecc6-45e8-8325-d9bbd7ecfbea and timestamp: 2020-09-13T06:48:46.413000000Z:
                     Metric:                    Value:
                   read_time                     11.72
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 13, 2020 6:48:46 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 3 mins 41.732 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 30s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/clotlh7cmjyb4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #988

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/988/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10861] Adds URNs and payloads to PubSub transforms to allow


------------------------------------------
[...truncated 283.34 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 13, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 13, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 13, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 13, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 13, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 13, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2020 12:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 13, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 13, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 13, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 13, 2020 12:45:28 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-5diZz_uRuWZG_q1Il7CDozVCAppfQDK8T2dLeFthDlI.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-PsicMQVhUErgQCGuw21uDfF6jSXXtKJaCPYulUA-9QQ.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests--Jt-k7dumamrQmhLJNzXhiJzM73jPyiBuZmL4AG3TPg.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-HtMUjouDbvAU9MXDLQcwxU6prA2Vyr9UUfXXksIHWZA.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-nUzjyJ5YZGFE4qbm2WEJMRKytgV5-d4jTyt9FVA-tis.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-Dsd0oGbV8-T4RUhKsEVhTaUwrFTFLNUBQWai55VKdX8.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-357HhFVGhBkuhVI9RSD0aDbaMfrUO42YKmHKrkW_4Yk.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-LUro0SK7PU48MlBjVGoas2o0EB_-_1SPUok8onnWba8.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-0hQDuHBb-2xa9q6bP3lou2RjfYaF5kKAs-TJNTyj_V0.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-5gOIj_tY1RD9W4KHYQ3ACCMfb9Ib4UfSGk6gyehCUt4.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-jJHXaUhzXCpOHyfbNI4DyNhjBfVHaWI9JDQKcIt2l3s.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-jhEoARraI2zeDO8fF1aHqOWBmX8BRv7sURWX1Viy-Ug.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-qOVRK4QDV0H9PkLf7t5h9rRmwyui9vp4wXJCvxsiqcs.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-5diZz_uRuWZG_q1Il7CDozVCAppfQDK8T2dLeFthDlI.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-3NxfJ64PRjNXPikzivHA4serZNBAB1z763nDQJh68A4.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-VBd__NjrNDUUrc4XIcBPVps7AXs9OjCPTZQshIEJEME.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-Znukb20zjBJqH_r62o8nX24I1dEVGLnVN5xjX_zt_jA.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5044388266524388549.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-JwngBD1hPWItBiz9vePm9B41dKxLCFCnrhPiryrL0is.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-5jnrMmvMGb2gqEO3e7N-_TK8GvTM5NbMMV4fF9JOrtA.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-yvs2boNR8423KvnJ3m_nw3CwRl656AeAjRogEgxdU6s.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-sGDEymxeHpIzRhzcgnVrf2l_yUK6DaGJx0Rj5C5YhNI.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-Ul447er-TayFIqHXaByOU3N_hWAOjDAzfwk6lKSGPMM.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-FTjHAdyKgcKXZrVK27U0zRldZWqzB_xMaQELN8VGTJw.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-pDDxdYEivLHGhOraPMM1cSTQD443cmIn3jLkmT2lcu8.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-kYeB-lyqjvdvPz_I_8n6FoUfBkXdcMGV6rSpWTz2N3Q.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-Vj1fCqGwYfIbGHsYwYdfilFOv-JU43KwYv2BAOOEyJ8.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-g7OE6gHE8N_YT01ujnuWoM5yS3-CGvVg1d3LQANrMKI.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-QH23r_soQXfmHsUh0FW4MYrDLQwDIIdZ_ch_B7rQzH4.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-YxoWbVyYMdqq2kNFDr3wOLo8wK0E_9LFW5OjgJr6cz0.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-UPRhPAx6KcOdZJQvnAD1DeiAD0V3sXuFDjfaxdBO4Es.jar
    Sep 13, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-hau9eW65Gd8rrwwLwjX0BKVNLVjWylZ4J_1TUR21DjE.jar
    Sep 13, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 13, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 13, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 13, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 13, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 13, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 13, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95394 bytes, hash c6e8a4483f86b0fe17ff7d8684109547fabd26bf7459236e1bde8f74a5b8d843> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-xuikSD-GsP4X_32GhBCVR_q9Jr90WSNuG96PdKW42EM.pb
    Sep 13, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 13, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-12_17_45_32-2442042179356823465?project=apache-beam-testing
    Sep 13, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-12_17_45_32-2442042179356823465
    Sep 13, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-12_17_45_32-2442042179356823465
    Sep 13, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-13T00:45:32.689Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 13, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:45:40.353Z: Worker configuration: n1-standard-1 in us-central1-f.
    Sep 13, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:45:40.923Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 13, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:45:40.959Z: Expanding GroupByKey operations into optimizable parts.
    Sep 13, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:45:40.993Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 13, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:45:41.087Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 13, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:45:41.116Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 13, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:45:41.152Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 13, 2020 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:45:41.187Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 13, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:45:41.537Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 13, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:45:41.611Z: Starting 5 workers in us-central1-f...
    Sep 13, 2020 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-13T00:45:56.085Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 13, 2020 12:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:46:13.651Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 13, 2020 12:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:46:32.764Z: Workers have started successfully.
    Sep 13, 2020 12:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:46:32.800Z: Workers have started successfully.
    Sep 13, 2020 12:47:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:47:02.128Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 13, 2020 12:47:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:47:02.290Z: Cleaning up.
    Sep 13, 2020 12:47:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:47:02.394Z: Stopping worker pool...
    Sep 13, 2020 12:47:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:47:50.435Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 13, 2020 12:47:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-13T00:47:50.487Z: Worker pool stopped.
    Sep 13, 2020 12:47:57 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-12_17_45_32-2442042179356823465 finished with status DONE.
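
In addition to the cancel command printed above, a finished job like this one can still be inspected with the gcloud CLI (illustrative command; it assumes the project and region shown in the log):

    > gcloud dataflow jobs describe 2020-09-12_17_45_32-2442042179356823465 --project=apache-beam-testing --region=us-central1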

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f35ccb0b-2a11-46b0-b633-bde6ac702d52 and timestamp: 2020-09-13T00:47:57.815000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    11.619

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 13, 2020 12:47:58 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
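
This warning only means the run's metrics were not exported to InfluxDB; it does not itself fail the test. Publishing would require passing the measurement and database settings to the test, e.g. as pipeline options along the lines of the sketch below (the option names here are hypothetical and not confirmed against this Beam version):

    --influxMeasurement=<measurement> --influxDatabase=<database>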

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 38.428 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings
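
Following the two suggestions above, a local re-run of the failing task with fuller diagnostics would look roughly like this (illustrative command; it assumes a Beam source checkout with the Gradle wrapper):

    > ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --stacktrace --warning-mode all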

BUILD FAILED in 3m 42s
107 actionable tasks: 74 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/nuhhjcoofl2yo

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #987

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/987/display/redirect>

Changes:


------------------------------------------
[...truncated 279.24 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 12, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 12, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 12, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 12, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 12, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
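
The exception message above also spells out the remedy: the PCollection<Row> produced by the RowMonitor ParDo carries no schema, so Beam cannot infer a RowCoder for it. A minimal sketch of what the message asks for is shown below (illustrative application code, not the test's actual source; field names and types are assumptions taken from the query above):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaFix {
      // Attach an explicit schema so downstream transforms can infer a RowCoder.
      static PCollection<Row> withQuerySchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();
        // Equivalent: rows.setCoder(org.apache.beam.sdk.coders.RowCoder.of(schema))
        return rows.setRowSchema(schema);
      }
    }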

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 12, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 12, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2020 6:45:11 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 12, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 12, 2020 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 12, 2020 6:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-RIUc7D-wT-3Xu-pSSnNbaoVWmG8qFsrrUtIlGXCSc-M.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-JOGwniywT7hH4scTaazL5e5CGe6HLQjWve5gadGBatc.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-3TDCx_Pcriv3b8nmWAAy5zF7p2PNpVhEQ78_wjtIT7o.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-ecM-cjZHsuv-vLAdNMJsT05d3vjpOVRghbpp0xHfhoc.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6190156969055888399.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-iPGL1gHZB5hn36EPyFWerbiLaDfXxWsXpgUr3bqUOuI.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-3TDCx_Pcriv3b8nmWAAy5zF7p2PNpVhEQ78_wjtIT7o.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-411spltmxUvPD3u9EX-POIl3rPvOt4_uR7dm-DfjzvQ.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-UpDOCdEtfAQuVo77juDMMclhYOBtHd6f6a1X2O6WTPk.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-FT6Guez8yRRjdcifKSGNjhebxe6M7o4hIeTBgI-xZIo.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-mV0N9SV_LwY-3pfTMjTzcwHcuuO3eSNpB3KHNGe84l0.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-n6XOhc2iAq9zOePawdseFU42CWbypMPT_ZElyDV4vHw.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-UNHdoIG4fNDDhzjkm6qJ_eIiwQrZ4e9dLpAsjK6t9MQ.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-xMCTfAN30JbgXY_daEWjMtUOrMNoerZ754VjzW-cgac.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-tQ1q-R8thiOp6Dn5IYHvFMqAeUDdvkdgX_6ApkN9B_I.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-DQmS06U8Fvxm7Gn87IeTb41ANociGGIOkFcgKB3KSAM.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-ftMHWMKbwWwxPnHkF6wbAfJXsyIVZsmuOTaU2iNwTJE.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-MR0nrPmMOTriTglSOWSASq6pZT5cw-hdETF5OTSoBec.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-4bY7c9djIvf77Ohvq5xwnNHNSaInJFk4Lrcx7IPqgrI.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-9hYTxpQnOSZxMvlcUNmuPbrMchDYYptfb_EkQsR5k3k.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-osZtRfKznYFw_JlDkTISSmC4hifh3JChHG2-T13pZes.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-WhmxBKtp8f-C3cGV5e12Rsln9Btl7v_eaQywoKYvKlk.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-n_AXw0VKyif5jyBHKfPrfUbMLLEBUHZDl1WkZQBCf9E.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-OOUvlm1vQloTVR3tnCJ1l0BHpEuFBgoPlQaoAmezOCc.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-UktcsxXvOfsKiBeyn0cHIzw4LZrrR0536Ro9fLlvYyU.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-impoD8M75cj7CpGL_PyURf7JL09AWCNSchxAigXnewA.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-bclA7dAD9g3Jj7O7BtKLIrJfqbzFf7nbnrDrgC2Fj2M.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-nsBGxwGbmUzGEAKrRArLP3lftpkZ_afNbesLJApdmVY.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-3RO2ob7iNEzjbqzR7r2Xt-B_GqKniiJppug_H3fEuk0.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-YhPkReUb_NrwpSFYgWvYafPWyxQ2GKkuEFDYvJcz-_c.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-q9XYvHS5ZGrt0fWTG77xv9npC79r9Pjt633HCAVnOEc.jar
    Sep 12, 2020 6:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-EzPogaV70wP_A03P-JRUVrtHfL5MYfGaMLFMry0aO7E.jar
    Sep 12, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 0 seconds
    Sep 12, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 12, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 12, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 12, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 12, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 12, 2020 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95394 bytes, hash b4da06e9077831184b032eee574a659454966fefdd15dff5e55e10331691ed40> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-tNoG6Qd4MRhLAy7uV0pllFSWb-_dFd_15V4QMxaR7UA.pb
    Sep 12, 2020 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 12, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-12_11_45_17-18202781121867523875?project=apache-beam-testing
    Sep 12, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-12_11_45_17-18202781121867523875
    Sep 12, 2020 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-12_11_45_17-18202781121867523875
    Sep 12, 2020 6:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-12T18:45:17.576Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 12, 2020 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:45:24.723Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 12, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:45:25.377Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 12, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:45:25.418Z: Expanding GroupByKey operations into optimizable parts.
    Sep 12, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:45:25.446Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 12, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:45:25.498Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 12, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:45:25.527Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 12, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:45:25.560Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 12, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:45:25.596Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 12, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:45:25.983Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 12, 2020 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:45:26.064Z: Starting 5 workers in us-central1-b...
    Sep 12, 2020 6:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-12T18:45:51.325Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 12, 2020 6:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:45:56.439Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 12, 2020 6:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:46:14.063Z: Workers have started successfully.
    Sep 12, 2020 6:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:46:14.097Z: Workers have started successfully.
    Sep 12, 2020 6:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:46:49.517Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 12, 2020 6:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:46:49.668Z: Cleaning up.
    Sep 12, 2020 6:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:46:49.751Z: Stopping worker pool...
    Sep 12, 2020 6:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:47:41.359Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 12, 2020 6:47:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T18:47:41.400Z: Worker pool stopped.
    Sep 12, 2020 6:47:49 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-12_11_45_17-18202781121867523875 finished with status DONE.
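
For context, the usedFields and the filter logged earlier in this run are what the SQL layer hands to the BigQuery Storage read. Written by hand, roughly the same read would look like the sketch below (illustrative only; the field list and predicate come from the log, while the class, method name, and table reference are assumptions):

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    class PushDownEquivalent {
      // Roughly the read that the SQL push-down corresponds to for this query.
      static PCollection<TableRow> read(Pipeline pipeline, String table) {
        return pipeline.apply(
            BigQueryIO.readTableRows()
                .from(table)  // e.g. "<project>:<dataset>.<table>" backing beam.HACKER_NEWS
                .withMethod(Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction(
                    "(type = 'story' OR type = 'job') AND score > 2"));
      }
    }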

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 1d7a904e-2477-4cbd-b44d-81bd21923c91 and timestamp: 2020-09-12T18:47:49.267000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    16.634

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 12, 2020 6:47:49 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 2 mins 46.715 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 33s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/zkm24aezxos36

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #986

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/986/display/redirect>

Changes:


------------------------------------------
[...truncated 278.99 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 12, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 12, 2020 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 12, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 12, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 12, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 12, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 12, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 12, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 12, 2020 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 12, 2020 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-z90nMHxJysgy2fVqLL7dSxMRTpaj_DkOkaKyTm5AjiQ.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-xjpsY1CDdiNH4GcH0IosGDFe5-fc0BpbWzAYkDaBXDk.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-nODW2BB8MfYe49sXEMO_JDDOHJkv5ZvqD-5t-C5COo0.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-CJNWcV72-nHwEc6so1QNIPwuBMgdbbrvePhYIkqxEJE.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-VzbQGxA-iPL-QW8hf3xGYCCcBNOLEn6n4Plo2-I8c34.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-bhFZYA0yTghTs8BwfiPAw_igEJLTzWIggb-wb5eO4dY.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-LNxTYSuNZ7QbKSkngFHAGYM5-tQaEOT0uOcUs5IypRU.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-3_0ziMRMnaat9w_fu2QFfDZCQWjcqMh68i9al9qPLDk.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-Py3cehl7iScRQtQHX6YrscKeUxy12A7PoEbe8zKdbCk.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-z90nMHxJysgy2fVqLL7dSxMRTpaj_DkOkaKyTm5AjiQ.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-4wMWeARapQtYGfuZAmT6avEPNlYRFwfopp_gV2-jPnU.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-g8a4J2UgngcLcnCBbJqF2L5A9ZvPf96pvOHqb94wikk.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-V4wdv-LgUm1EjG9OBFD-pNMIFORqO13fF1W7EWNtfmI.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-MUoq4Th92I1RO1GjuRCGXo0oQySg97GPRrp2cXIOj1o.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-RpPlxJ7fHCY1KSHW2z66JAeNDzK-VNHaBazOUXU7LP0.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT--norSEYy8lHMxHHBJd9Lg9fOih6hLAGDT-B_tY-TmUs.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-VzvJl3bkjrKRxpDIF4GDVI2OM8Nhyw_-7hRP8yQp_Us.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-tmdEIuqQHxsfrSq96AKHs3FopvuTnJmRN1Lb07eSK7U.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-Y_-uV9q90QrmP6dSA3S9h5xtI50CDXKyguWJKFlgmcI.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-_YIbCxfA5XOyMVo8kquLO4qZ12fnMRBe370ygweFVFI.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-xzcRqjAjnRJqLvLaM7fFhb3v5jRY_ltSAzqPHWIJq_s.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6928200970980496070.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-VVAkN4VVwgNkWFZPVgeiVTlFsBkPZkxJlw8J3hLAx6M.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-7pr-N1rK0dP8R1615EguLH4hV6Brba2gsrh1Tf_YX6Q.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests--snj8XHUcb4kiXHNuVfyh6Z7QhC-d6mM5HW6P-oUHUs.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-lG1-tl5G4gHXARL5ZlvTEwCV0ZZzGdHU6Ykq564C3vo.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-Yo823blVKIewudqVShpCEA4hC9XW7xc9v4Frj5ESu9s.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-VYBl92V6ffH6vF2JaZAiXMvmaEraZbVDF1jgMA2q6DY.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-QoY-pks85DnX2JST7eBLKflvwzn-bmdKxDHkzn61JtI.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-LvmNivz44mdZAG9a5pJhw9B2xOwU66F3BPOjHJTGwr4.jar
    Sep 12, 2020 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-cZig0-PTcyW469z9Onvi7DPDcVsILcGnLup1u5RHN_U.jar
    Sep 12, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-JhVoQ5LeV5Ln_TeNpduvnX56buwmbUsfboaChT29qss.jar
    Sep 12, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 12, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 12, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 12, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 12, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 12, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 12, 2020 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95394 bytes, hash 56e9b29f7c8cea4e2786bbc07fdbf248902a4482ff6987356ff4b9025686fa51> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Vumyn3yM6k4nhrvAf9vySJAqRIL_aYc1b_S5AlaG-lE.pb
    Sep 12, 2020 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 12, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-12_05_45_19-16738107090139455381?project=apache-beam-testing
    Sep 12, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-12_05_45_19-16738107090139455381
    Sep 12, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-12_05_45_19-16738107090139455381
    Sep 12, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-12T12:45:20.066Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 12, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:45:28.861Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 12, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:45:29.622Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 12, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:45:29.683Z: Expanding GroupByKey operations into optimizable parts.
    Sep 12, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:45:29.721Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 12, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:45:29.802Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 12, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:45:29.828Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 12, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:45:29.854Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 12, 2020 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:45:29.886Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 12, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:45:30.254Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 12, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:45:30.327Z: Starting 5 workers in us-central1-b...
    Sep 12, 2020 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-12T12:45:54.737Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 12, 2020 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:45:59.391Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 12, 2020 12:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:46:15.328Z: Workers have started successfully.
    Sep 12, 2020 12:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:46:15.360Z: Workers have started successfully.
    Sep 12, 2020 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:46:49.190Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 12, 2020 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:46:49.328Z: Cleaning up.
    Sep 12, 2020 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:46:49.421Z: Stopping worker pool...
    Sep 12, 2020 12:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:47:40.379Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 12, 2020 12:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T12:47:40.429Z: Worker pool stopped.
    Sep 12, 2020 12:47:48 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-12_05_45_19-16738107090139455381 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c30848eb-5256-44c0-bfe0-063b522d740c and timestamp: 2020-09-12T12:47:48.099000000Z:
                     Metric:                    Value:
                   read_time                    15.368
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 12, 2020 12:47:48 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 42.057 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 31s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/yi5bd2nvpocyw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #985

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/985/display/redirect>

Changes:


------------------------------------------
[...truncated 278.98 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
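
    The IllegalStateException above means the SDK could not infer a RowCoder because the RowMonitor output PCollection has no schema attached. A minimal sketch of the remedy the message suggests (attaching a schema so the default RowCoder can be used) is shown below; it is an illustration only, not the fix applied in BigQueryIOPushDownIT, and the field names are assumptions that merely mirror the projected columns of the query above.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Hypothetical schema mirroring the projected columns (author, type, title, score).
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        Row row = Row.withSchema(schema).addValues("someone", "story", "a title", 3L).build();

        // Attaching the schema lets the SDK pick RowCoder automatically instead of
        // failing with "Unable to return a default Coder".
        PCollection<Row> rows = p.apply(Create.of(row).withRowSchema(schema));

        // For a PCollection<Row> produced by another transform, the equivalent is:
        // rows.setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }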

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 12, 2020 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
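
    For comparison, the push-down reported above (used fields plus a supported row filter) corresponds roughly to the following direct use of BigQueryIO with the Storage API. This is a hedged sketch, not the code the SQL layer generates; the public Hacker News table name is assumed purely for illustration.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    // Illustrative table; the IT reads its own HACKER_NEWS copy.
                    .from("bigquery-public-data:hacker_news.full")
                    .withMethod(TypedRead.Method.DIRECT_READ)
                    // Projection push-down: request only the fields the query uses.
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // Predicate push-down: filter evaluated by the BigQuery Storage API.
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }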
    Sep 12, 2020 6:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-pn0SwLdkI0pahJHYDBxPOgX1WZ4uDVZYjccQYpo7oeA.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-pn0SwLdkI0pahJHYDBxPOgX1WZ4uDVZYjccQYpo7oeA.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-nHH4ftcX39NARRKedGQ-QpjNjYd4a11pfxciViCe6WA.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-jO6kSsJON2GXcYBaUa0Tj1Gsbic9VbXIGiCQp5OTgzo.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6718575402158905871.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-BimeewIml3XjtJmxQN6uSY1w7XOznK23PThpumby1_k.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-HRvyD_NXuSL1ch2rh1djptd_MzKnU-qC4UPqGIO9okw.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-vJH7mzl1AJV7bcuyyNsUnPzyz5Sr1kutpVZKRUNJwZk.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-LVCDZecLiZUzatl6SeUZtjbifFr4sbkibKSH2v4wRp8.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-ienPwK4WHmGnmaI2W5zWYnMwUKc-GzXpnZ9mOPXoCrY.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-pftuOscHaO-bYIyABe4Cgm8kf6N_yF21i3E8OyI-pjA.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-pll8XQujkQm9xci3NAiDCanQthD5_p6wb3UTP7tZnhw.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-ThKXZH1IShz3FlUXCOcU0lHq7yEtECCxlmEKx0JbDbc.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-znClMJiYPgsaa1MPlqbpSoFgjga7d4s_UlyNS2vPGJk.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-Ef5uCdly_r_tyj8K959BRBZGC2j7qxY5jReJGF7SDYA.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-gbeMcPIVKrLUtGT8t5XVeYyf26ImW84NjGLoXbtWaJQ.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-v_QUL2o2QxfjsYRTwhs4_Dv9P_Qbb9zKRsQ5PFgNgNY.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-DJSeyeXtpfB3J-QuORA_xtu8adBGltYze8ETKXN0Dkc.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-yM-uNbxr8e_3PV6fkCGDGzPlzdKI6RSzau9j2E-TbRI.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-Z5vgNRtS0P5VaUMfWbOZ3kwQYsKdW-y5FvhgYNBI9gw.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-ZaNL3XsficK2_lUHe52wHDbaztA5Fc6F8op0xv1J_ms.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-JEYjAyhR2tyFC7yzbcB_H7SviuT58WmTlTxoApBpuDE.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-4bbGL-aDTVtvbsa8AJLKyutXZVyddVbvOPACXqXUbs4.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-AG83rGpAF7JL38S0edVG1YqS08zkx5gB1Tp5Rg0cyEs.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-EIFlkV3SAZ5kO38OMLdXWaicV_bmQtCnHmm9B5ZYb6g.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests--MvoIWxENv9JEybVcy6KhN2VtHJYeK68Zcs9qm4-MxE.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-2TrPgaLA11Wd0pRyaGoipPZijJfucL_8fEwEQBaHfEI.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-TCQyT7Q7fhsaY5KSEh0wXpTX33tu1xTk4ku1dYN3Pxs.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-5Dr7fPFgXEq03Gv_n-BVCW-xsnmiyk3ymOhsU136Mpg.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-g4V0UXIw0Gvxub7uRE3j2bGghZ2qgFLlVBgBVfJwVsg.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-uVDh8pvQQs56hO8yf5xz7Z5nVAo84SqJauDB5ANudV8.jar
    Sep 12, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-1mc-rbh_zA73BkM3NqhCDU6WmbZ1ssaTVhtCxao4rJ0.jar
    Sep 12, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 189 files cached, 30 files newly uploaded in 1 seconds
    Sep 12, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 12, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 12, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 12, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 12, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 12, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95394 bytes, hash 04195b9a11bc6b4afb18554514516387d16d2121cc710331485ca37a948f8f20> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-BBlbmhG8a0r7GFVFFFFjh9FtISHMcQMxSFyjepSPjyA.pb
    Sep 12, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 12, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-11_23_45_16-13117806937219269478?project=apache-beam-testing
    Sep 12, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-11_23_45_16-13117806937219269478
    Sep 12, 2020 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-11_23_45_16-13117806937219269478
    Sep 12, 2020 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-12T06:45:16.818Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:45:24.680Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:45:25.404Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:45:25.442Z: Expanding GroupByKey operations into optimizable parts.
    Sep 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:45:25.480Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:45:25.561Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:45:25.591Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:45:25.621Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:45:25.658Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:45:26.172Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 12, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:45:26.233Z: Starting 5 workers in us-central1-b...
    Sep 12, 2020 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:45:53.746Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 12, 2020 6:46:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-12T06:45:59.992Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 12, 2020 6:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:46:11.206Z: Workers have started successfully.
    Sep 12, 2020 6:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:46:11.277Z: Workers have started successfully.
    Sep 12, 2020 6:46:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:46:40.658Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 12, 2020 6:46:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:46:40.809Z: Cleaning up.
    Sep 12, 2020 6:46:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:46:40.892Z: Stopping worker pool...
    Sep 12, 2020 6:47:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:47:31.621Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 12, 2020 6:47:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T06:47:31.668Z: Worker pool stopped.
    Sep 12, 2020 6:47:40 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-11_23_45_16-13117806937219269478 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 27727e58-9757-4287-bf15-1f91fe943b0d and timestamp: 2020-09-12T06:47:40.597000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.907

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 12, 2020 6:47:41 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 37.963 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 24s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/x2m3zlfcocgeq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #984

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/984/display/redirect?page=changes>

Changes:

[Luke Cwik] [BEAM-10670] Add support for bundle finalization to DoFnOperator for

[Luke Cwik] [BEAM-10670] Make Flink to be opt-out for SplittableDoFn powering the

[Luke Cwik] fixup! Fix translation issue where we should never be translating

[Luke Cwik] fixup! Address PR comments

[Robert Bradshaw] [BEAM-10833] Fix type inference for BUILD_MAP.

[Robert Bradshaw] Guard test for Python 3.

[Robert Bradshaw] [BEAM-9547] A couple trivial but common dataframe methods.

[Robert Bradshaw] [BEAM-9561] Improve WontImplement reporting.

[noreply] [BEAM-10876] Fix TypeError in dataflow_metrics when distribution sum …

[noreply] [BEAM-7523] Enable KafkaCSVTableIT using KafkaContainer (#12826)

[noreply] Deprecate obsolete CombineFn.add_inputs. (#12802)

[noreply] [BEAM-10678] Split up assertion for clarity (#12828)


------------------------------------------
[...truncated 291.49 KB...]
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 12, 2020 12:45:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 12, 2020 12:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2020 12:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2020 12:45:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 12, 2020 12:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2020 12:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2020 12:45:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 12, 2020 12:45:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 12, 2020 12:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 12, 2020 12:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 219 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-LJ_9kaheq7w9BOODE3vO_VHU4msmNF498-oDmZDMkqA.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6336642198870915392.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-0qkYVZuiOlYMEtwEO0WgTBHUQd_o9bExGhX5sX8cLWM.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-ZkOt_IZ69LrXxGdhYGjGLrdcSIfCGBMwsIwuv07kBbU.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-KARW6fBHPnRXIlLEd1wYWPhWs8dOGcTVNyBpb7Olmpg.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-gFtayNgLjmvfoDs5w7U7gWnXFnjPbZIHw40GPxy1pmo.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-Yah8QZgL716wAz5TDj42xDmIxYFqCBbI5ktW3aXI9jQ.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-PwVMF9y-83UPxbi3SGymGSCsbwgGE9wFP1y__NhJ6IY.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-AppUfyD0dm2Crnbouj_ANpSfXmKQY3cmvkYyhZ_yDJ8.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-w3bl0P80QNKqfrsTJrr9YQxALjvB16ioQ7efG3dRbMQ.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-LJ_9kaheq7w9BOODE3vO_VHU4msmNF498-oDmZDMkqA.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-8ybF8Z2Kv0piFCjVRYEi0O-nx5tQRn5WlRb-r_FtGkk.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-LDP9Wiqq-yvlrW980XUcNhM5-Ucyk_IATH_OalqaIRU.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-bXbp07XzKUhn9HH81b9PQX7yr9vOVBMbHlR2kmIklns.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-3cYzjeOQ_FTy9lWrVVhsUAfry40ywUvdLlu-I25I768.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-_FwKQoGi2je8cUzB32PXtati5wiu7yFcTOk1gmbLrjY.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-1gXPKzTmtLLh3xWnFF8K43hCudpDH8uEyXgcbP9JnNg.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-VYxCfL55YVnvvW-Y6eUUU6xN7fKnI22QJfSWDdNyOeE.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-caeOSfNzkencZTebL_rnTCftyff8UIgbZqL8AwZpZzo.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-bYQ7mAF07g8pbHJF5VP_e7DzCUgls5_IlXn4raHwgrM.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-OJjBUn3KZdbrZr1cVWZlWS73xTiP1qYEf1k1Z_n6Sew.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-is2k5DpHDpuxSoFkASi5wiuKGCqZaYrDdYtBzm8Oq1E.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-mRNy2Ezainit0Ls36xdV1-UL55iTgaMp2SRXBPMGpLg.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-5iIRUowILCRQUCjs2IEmm7SAE8-tega96LAXdl_gcEk.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-WHwUIp9z6M5IIVNkZgY1PX-lJ5qugqD-M-eZlR3_mqI.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-c0wWcod0vRgzv4gonS8cX2J3KLbidJeKWHD6ctd6mRk.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-0eQpnwUWdKoG1qYl1EOKTFpBw0y3hDbnNe9b0AQROi8.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-uiXwkx4V_Q2jcu7W4v7IjNKN27dlUY-Bkb82SXlgnUI.jar
    Sep 12, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-usT10i7v1Au91jCtFD62bcTFrEbEyb6L91RpWqecJuo.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-Cl5kVpVBXNyWw__iSK5-awFBp15GhzWUhFNNLKGg_PQ.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.testcontainers/testcontainers/1.14.3/71fc82ba663f469447a19434e7db90f3a872753/testcontainers-1.14.3.jar to gs://temp-storage-for-perf-tests/loadtests/staging/testcontainers-1.14.3-pumfUOVGxIRJ3HrNRtmRWTZcp5hY59_klYGQF6d6puQ.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-ejCs8snBUVAmIW8mhAibc780K0D-vyOTVHe_5fv2Q48.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.testcontainers/kafka/1.14.3/b90885e30e86eb454e7b0e8e580cf59616e9de39/kafka-1.14.3.jar to gs://temp-storage-for-perf-tests/loadtests/staging/kafka-1.14.3-ITKoa6D-lUtLem0itUO_R5B5PqtFpvvOjJf9bTByVUk.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth/tcp-unix-socket-proxy/1.0.2/cf53989130986c60113032e25185f4496ffbc186/tcp-unix-socket-proxy-1.0.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/tcp-unix-socket-proxy-1.0.2-KCMDGSy00Z-cM3qmB0jDR5sQSJV17Q8Xofxkoxij7SM.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-uxLECycFhDCOyFLuHXIWbZHuOMSnqDwqRC5PiRbKw7o.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth.visible-assertions/visible-assertions/2.1.2/20d31a578030ec8e941888537267d3123c2ad1c1/visible-assertions-2.1.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/visible-assertions-2.1.2-RQSulosjfNzcto_1sHqmOr5JkvkHp3w9YSCqm5BBQBw.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth.duct-tape/duct-tape/1.0.8/92edc22a9ab2f3e17c9bf700aaee377d50e8b530/duct-tape-1.0.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/duct-tape-1.0.8-Mc7xLd7JedH4bXz3CMQaF9pSPQXGhf1mQunQsq3bckA.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.kohlschutter.junixsocket/junixsocket-common/2.0.4/b4d1870bf903412533e0b79c6fcd402defcfc05b/junixsocket-common-2.0.4.jar to gs://temp-storage-for-perf-tests/loadtests/staging/junixsocket-common-2.0.4-r8N2Fez3-t_3TSmvtEP-T2M9OWZG2J2CXoIkoneDn2A.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.kohlschutter.junixsocket/junixsocket-native-common/2.0.4/726bd66a934dea39c817382986496fa4eda96411/junixsocket-native-common-2.0.4.jar to gs://temp-storage-for-perf-tests/loadtests/staging/junixsocket-native-common-2.0.4-92O4XsFT2VMJB0dOfyBspSsocDfXBLrO3aON1cTQ9gw.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna-platform/5.5.0/af38e7c4d0fc73c23ecd785443705bfdee5b90bf/jna-platform-5.5.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-platform-5.5.0-JNgWIfgqwp_N2adBFgMfWQeiNDFY5hb0Vzu_okNK4NU.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna/5.5.0/e0845217c4907822403912ad6828d8e0b256208/jna-5.5.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-5.5.0-swj66_5O1AnehBDgpjLRZLISawNfbqz_lo05CMr7TZ4.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.scijava/native-lib-loader/2.0.2/1451fa03954c5e31a358b411147de472b4dab92c/native-lib-loader-2.0.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/native-lib-loader-2.0.2-5WfHHp8_9T94vVj9a6bUcc4x4SY_XofR4fzF0-2h4kg.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.jetbrains/annotations/19.0.0/efbff6752f67a7c9de3e4251c086a88e23591dfd/annotations-19.0.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/annotations-19.0.0-Ev8B7q8MCcamjy7AJLO_n6TK1uaLdLlov2LH91kEcDI.jar
    Sep 12, 2020 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 178 files cached, 41 files newly uploaded in 1 seconds
    Sep 12, 2020 12:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 12, 2020 12:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 12, 2020 12:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 12, 2020 12:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 12, 2020 12:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 12, 2020 12:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <95394 bytes, hash f35bf4a5a513c0bccb385ff729126089f5ac94a4f5d2ed9e647b74ed6a729e1d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-81v0paUTwLzLOF_3KRJgifWslKT10u2eZHt07Wpynh0.pb
    Sep 12, 2020 12:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 12, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-11_17_45_39-8814380002563364598?project=apache-beam-testing
    Sep 12, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-11_17_45_39-8814380002563364598
    Sep 12, 2020 12:45:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-11_17_45_39-8814380002563364598
    Sep 12, 2020 12:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-12T00:45:39.602Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 12, 2020 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:45:50.578Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 12, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:45:52.262Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 12, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:45:52.295Z: Expanding GroupByKey operations into optimizable parts.
    Sep 12, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:45:52.323Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 12, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:45:52.398Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 12, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:45:52.425Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 12, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:45:52.461Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 12, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:45:52.495Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 12, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:45:52.971Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 12, 2020 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:45:53.036Z: Starting 5 workers in us-central1-b...
    Sep 12, 2020 12:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-12T00:45:58.703Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 12, 2020 12:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:46:24.546Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 12, 2020 12:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:46:45.513Z: Workers have started successfully.
    Sep 12, 2020 12:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:46:45.546Z: Workers have started successfully.
    Sep 12, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:47:20.484Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 12, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:47:20.644Z: Cleaning up.
    Sep 12, 2020 12:47:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:47:20.733Z: Stopping worker pool...
    Sep 12, 2020 12:48:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:48:13.271Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 12, 2020 12:48:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-12T00:48:13.320Z: Worker pool stopped.
    Sep 12, 2020 12:48:22 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-11_17_45_39-8814380002563364598 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 5a924cd9-1e61-435d-b54f-0b537a85a72a and timestamp: 2020-09-12T00:48:22.838000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    16.322

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 12, 2020 12:48:23 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 57.425 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 6s
107 actionable tasks: 74 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/ugihktdnc6lnc

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #983

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/983/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10874] Support v2 Go protos. (#12816)

[noreply] [BEAM-10824] [BEAM-7654] Use deterministic hash functions in

[noreply] [BEAM-9561] Fix issue with pickling in doctests. (#12560)

[Alan Myrvold] [BEAM-10868] Fix build (docker run) concurrency issue when building

[noreply] [BEAM-10009] Add beam:logical_type:micros_instant:v1 (#12764)


------------------------------------------
[...truncated 287.15 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 11, 2020 6:45:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 11, 2020 6:45:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 11, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 11, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 11, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
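
    [Editor's note] As the exception message itself suggests, this kind of failure is normally resolved by giving the Row PCollection a schema (via PCollection.setRowSchema) or an explicit coder (via setCoder). The following is a minimal, hypothetical sketch using the Beam Java SDK; it is illustrative only and is not the integration test's actual code or fix. The field names and sample values are assumptions mirroring the query's projection.

        import org.apache.beam.sdk.Pipeline;
        import org.apache.beam.sdk.schemas.Schema;
        import org.apache.beam.sdk.transforms.Create;
        import org.apache.beam.sdk.transforms.DoFn;
        import org.apache.beam.sdk.transforms.ParDo;
        import org.apache.beam.sdk.values.PCollection;
        import org.apache.beam.sdk.values.Row;

        public class RowSchemaSketch {
          public static void main(String[] args) {
            Pipeline p = Pipeline.create();

            // Illustrative schema; field names mirror the query's projection.
            final Schema schema =
                Schema.builder()
                    .addStringField("author")
                    .addStringField("type")
                    .addStringField("title")
                    .addInt64Field("score")
                    .build();

            PCollection<Row> rows =
                p.apply(Create.of("seed"))
                    .apply(
                        ParDo.of(
                            new DoFn<String, Row>() {
                              @ProcessElement
                              public void process(@Element String ignored, OutputReceiver<Row> out) {
                                out.output(
                                    Row.withSchema(schema)
                                        .addValues("alice", "story", "Hello", 3L)
                                        .build());
                              }
                            }))
                    // Without this call, coder inference fails with the same
                    // IllegalStateException as in the stack trace above;
                    // rows.setCoder(RowCoder.of(schema)) would also work.
                    .setRowSchema(schema);

            p.run().waitUntilFinish();
          }
        }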

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 11, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 11, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 11, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 11, 2020 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
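
    [Editor's note] The plan above pushes both the projection (by, type, title, score) and the filter into the BigQuery Storage read. For comparison, a hand-written read can request the same behavior directly through BigQueryIO's DIRECT_READ method with field selection and a row restriction. The sketch below is a hypothetical, self-contained illustration (the table reference is a placeholder, not the test's table) and is not the integration test's code.

        import java.util.Arrays;
        import com.google.api.services.bigquery.model.TableRow;
        import org.apache.beam.sdk.Pipeline;
        import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
        import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
        import org.apache.beam.sdk.values.PCollection;

        public class DirectReadSketch {
          public static void main(String[] args) {
            Pipeline p = Pipeline.create();

            PCollection<TableRow> rows =
                p.apply(
                    BigQueryIO.readTableRows()
                        .from("my-project:my_dataset.hacker_news") // placeholder table
                        .withMethod(Method.DIRECT_READ)
                        // Only the projected columns are read from the Storage API.
                        .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                        // The filter is evaluated server-side, as with the SQL push-down.
                        .withRowRestriction(
                            "(type = 'story' OR type = 'job') AND score > 2"));

            p.run().waitUntilFinish();
          }
        }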
    Sep 11, 2020 6:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-gMchEEhbxaRrNf1P5xcPsICu4oMIsJ-XW2cmgKwFatc.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-eBadOFA0fqv8G9L2Wd3Um477d0ki09Ob4UJqszX16OQ.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-gMchEEhbxaRrNf1P5xcPsICu4oMIsJ-XW2cmgKwFatc.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-t2eNpXQCgZwI5KgYP5VZf3jqeaBJjebeqfwEpjC0-jY.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-4AO1reP-sV1oEAvNZb758j2Uhv9BqgRWjDCZNEgm3lA.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-Q6quusJltf6lfv5SsIXp0JCi1Sgq-MTAeQfqlrdsaYs.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-peZ5SYTL3Hqnq6vbGTtTswa0JiWBPJyU0FH_zPWcwq8.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-o4iBu5jmn2LrPwi_ds7QAXQ-pyvqAMd-BZWJ72s9u0k.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-fJSqknmdlL2NoQzJjJbaXLf7IeBLX9wg4vPX9PQjM2U.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-W341Jh_3oED0XPXotfCXz2afWgRCJNAYLdvMEtt7WyA.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-9klyhcYbdUxdyuDJqnl9gcRjoINslbLG1r4KnXqvCno.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8867722970680486480.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-7L3fLHt-HsqBCCDsrWHqFbTpom5tkfMcnG05HXfv_9A.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-k1ojP9Y9cNG3_qpXqEftM3hDyX6RDoXII_ebD-UO790.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-vHM22C2ymmgYRo-CAFvdehkmkbG3I12as-CfyYuKobg.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-uahFEsj4yV8_ydh1Cee6DLiXAJ7CxtvsLjfLjwSFofg.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-5b9QI-W_e5BUDfLTwbh_64SDt5e_cXZfe-S9Pm6tRQg.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-g-JggqFeTaBMk9G6BOirjczISokGKmUX6Tpt35ACpXM.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-gLY7oc72_7_e5LRSMleAIVz_bSsa2IkMkxjiDdWciMs.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-M_A8Uc7DoqhCX1DswcSeVWLOdoVZuCNKuW6dEJvjwS0.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-iS3WTp25igCIdylCP65sZBnSutjSymp8YHoJN2k-1mk.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-CwlG70WDekWDWbbxkYBZ0rP_ZNFZN12C1VoDwkjh--Y.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-d5RbX3wLHAUPtPoCPgpISx8MMy5RVnWPR4XnqQDoKnM.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-8cn525-een-8dZi9RpwqimzKZlmsGH411tQ1zTZ0cA4.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-gg6Iy6lgyZu9L96cNue42RURw2ZUZgRBa_wA515Kpeg.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-GeO4IwTNrKdurrbYGb1rK1ODkEnDl77ZLygEVYK4qhQ.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-QVlKnfLaL6006khliamBpKj2KquSuLttRKvC4bUc9T0.jar
    Sep 11, 2020 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-yvK_oVVhcbLLQLH2R2tP_8SaMZ8WJdEFhc8MHpgekRY.jar
    Sep 11, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-IOsh7pW4-j0MYApu8-RZVQ9wzIMv7HYIPrw_1NgLOso.jar
    Sep 11, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-_6JpkNEB7vq0PjxapkCYFRoQohrt7PfzoSI39QXu1xU.jar
    Sep 11, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-Ixbo4tTmPCbm93rknuAmlW9wm20f12iYregH16UFDY4.jar
    Sep 11, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-V80DpgZ5XK-361QcHpIMn6ZQ0TegcVhn-kYG1H3EA9g.jar
    Sep 11, 2020 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Sep 11, 2020 6:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 11, 2020 6:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 11, 2020 6:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 11, 2020 6:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 11, 2020 6:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 11, 2020 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92071 bytes, hash 24c825147da13f35d03c06abe778da2e646a36d43a6e2bf4065954e68937694e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-JMglFH2hPzXQPAar53jaLmRqNtQ6biv0BllU5ok3aU4.pb
    Sep 11, 2020 6:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 11, 2020 6:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-11_11_45_36-17339166121582903736?project=apache-beam-testing
    Sep 11, 2020 6:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-11_11_45_36-17339166121582903736
    Sep 11, 2020 6:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-11_11_45_36-17339166121582903736
    Sep 11, 2020 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-11T18:45:36.344Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 11, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:45:43.383Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 11, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:45:44.153Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 11, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:45:44.196Z: Expanding GroupByKey operations into optimizable parts.
    Sep 11, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:45:44.237Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 11, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:45:44.314Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 11, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:45:44.354Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 11, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:45:44.389Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 11, 2020 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:45:44.423Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 11, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:45:44.852Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 11, 2020 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:45:45.039Z: Starting 5 workers in us-central1-b...
    Sep 11, 2020 6:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:46:16.978Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 11, 2020 6:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-11T18:46:20.825Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 11, 2020 6:46:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:46:39.841Z: Workers have started successfully.
    Sep 11, 2020 6:46:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:46:39.876Z: Workers have started successfully.
    Sep 11, 2020 6:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:47:08.746Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 11, 2020 6:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:47:08.901Z: Cleaning up.
    Sep 11, 2020 6:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:47:08.964Z: Stopping worker pool...
    Sep 11, 2020 6:48:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:47:59.014Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 11, 2020 6:48:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T18:47:59.057Z: Worker pool stopped.
    Sep 11, 2020 6:48:07 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-11_11_45_36-17339166121582903736 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 16ca54f6-27fe-44fd-8fea-e1a574fc4229 and timestamp: 2020-09-11T18:48:07.108000000Z:
                     Metric:                    Value:
                   read_time                    11.094
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 11, 2020 6:48:07 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 2 mins 44.079 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 50s
107 actionable tasks: 77 executed, 30 from cache

Publishing build scan...
https://gradle.com/s/ezh4jsl4ksqh4

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #982

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/982/display/redirect?page=changes>

Changes:

[nosacky] Update Gradle Wrapper

[noreply] Merge pull request #12435: [BEAM-10616] Added Python Pardo load tests


------------------------------------------
[...truncated 280.27 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 11, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 11, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 11, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 11, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 11, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 11, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 11, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 11, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 11, 2020 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 11, 2020 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Ae-f9Sheu2wbkzVMjhU3yuMeD46q9B4AN572q3FqU3Q.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-RrpzuKjX6tdnEEgoeH-Y52Xbsx-g9VWh8VwSDS1G8nE.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-Bqz0OrBTg1-gzj5rOdsvKvnbLIP3vFQZlqKf_aYtCEQ.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-eoARtvQ8xD38IYRnoQBQHk8gf9k7-LxlBbVK_qALrA4.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-i49DbrbEN_EF0EJNQGKAEkAP-2DgEBbBYFvGFhnuGmc.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-iRfh7N3OmNuIscFuMVAo2XpwtYnbhkpWs_0lFeFzVa0.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-qdrya5fq-KK9J1c0WClyeFwpFh5kbWPpxMpZsRx9MeU.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-18vI8T7FjKxmh898q8-a19MGtx-Mb9h6dgQPbGFfTYQ.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-SYDCsTRSert-DvrPCndDb-DBqRJy2H1nEZ6JoZ7ck04.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-JtWLeDuaTM_Xyx3rJIDqNTXdFBwta1DC1AjUAjX3uzM.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-6c7D4pNIrSoBOe-vtmBA8XJa2yK0z6A8dniKh06JpTs.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-IHIYYVFU0mFc46CJId0kRLNSlOAzqd3sXyHpKxKY7OA.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-5icT37eqIee1jpf9FJtDGlCIu9kupnL7Yh7wt0UAWJc.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-ZnDzBdSsnCzj6S0ZGkgpfLO7jplMuJcfo8ASiwHiRaE.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-Ae-f9Sheu2wbkzVMjhU3yuMeD46q9B4AN572q3FqU3Q.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test96809853339896008.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-awOOZ3_qju91NsiTwp5S6jydrmnTI8xkd0E-ROEMQdk.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-hZGsZpUChD-CIcnQb1C5bI2puzzSbFPl-oxm5kbm3-A.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-U_764p8r1QpTuCFYfY-lEtxl9ZJSjXSViVgqhb3j-0E.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-5VSasBxezAIjnogLb9FGuWNLDrb1RhgpybsaaaaiEoU.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-Hp4aPqnUT0FmswW5kyNUbhCSpw_-6GruqfvrZ3F1zAI.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests--74lxoyOIQi0xm0vs8Mqqna1RZRCRAY4AMkek0sNQe8.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-fG-93-yQPPNxI__sZBQWo6oIGM2hnAJ-yDs17nH4WUE.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-_ONWD1WHrJGCa8v-wOw6rDV5gyfCEUjIG9tcN-iyStI.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-WvxaJUzkqosZB2Y07f1-xRxpoafAG9-A7ZETM0fJ9tM.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-I5GaLJpazUJNu6niTzSWPKkGLOctdKMAj5AtQfQJMrc.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-sTKuC6BPGUoPoyHrvkMeUfBIeX8ldpUVRQRClk5K0T0.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-VxCGu6AS8WR3MJNSwvredsACzjgdFr7b_DBu5PrhCL8.jar
    Sep 11, 2020 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-gQSSwfJC8LnNIZYVahNDbAb6k8MUxy7p_GYamL1D_94.jar
    Sep 11, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-FBptW6-9s0jhAjlmzeUzQZQ8sxKAz6ZQpMnqakAZmig.jar
    Sep 11, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-xhtyF5wNVQbolMX5Ffp-wnPRKQVOjeHBt5d7r2rBTTM.jar
    Sep 11, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-LNGEc6ISGTzpKpyUMekIeV4v2zcZ7272xIymG_DX_0w.jar
    Sep 11, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Sep 11, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 11, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 11, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 11, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 11, 2020 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 11, 2020 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92069 bytes, hash 885bf7ae048fdf391c390a6e1c3eb413159eb05f7184af122967ae716024eb0f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-iFv3rgSP3zkcOQpuHD60ExWesF9xhK8SKWeucWAk6w8.pb
    Sep 11, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 11, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-11_05_45_22-4646391347541778270?project=apache-beam-testing
    Sep 11, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-11_05_45_22-4646391347541778270
    Sep 11, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-11_05_45_22-4646391347541778270
    Sep 11, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-11T12:45:22.136Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 11, 2020 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:45:28.991Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 11, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:45:29.740Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 11, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:45:29.769Z: Expanding GroupByKey operations into optimizable parts.
    Sep 11, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:45:29.797Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 11, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:45:29.877Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 11, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:45:29.916Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 11, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:45:29.945Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 11, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:45:30.091Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 11, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:45:30.444Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 11, 2020 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:45:30.534Z: Starting 5 workers in us-central1-b...
    Sep 11, 2020 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-11T12:45:37.267Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 11, 2020 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:46:00.777Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 11, 2020 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:46:00.807Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 11, 2020 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:46:06.128Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 11, 2020 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:46:18.999Z: Workers have started successfully.
    Sep 11, 2020 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:46:19.035Z: Workers have started successfully.
    Sep 11, 2020 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:46:50.387Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 11, 2020 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:46:50.566Z: Cleaning up.
    Sep 11, 2020 12:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:46:50.849Z: Stopping worker pool...
    Sep 11, 2020 12:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:47:42.221Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 11, 2020 12:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T12:47:42.256Z: Worker pool stopped.
    Sep 11, 2020 12:47:51 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-11_05_45_22-4646391347541778270 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 7725d500-0797-4593-b222-85df5960dc65 and timestamp: 2020-09-11T12:47:51.387000000Z:
                     Metric:                    Value:
                   read_time                    12.158
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 11, 2020 12:47:51 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.064 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.037 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 2 mins 43.268 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 34s
107 actionable tasks: 71 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/p4pqkcuacvoq4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #981

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/981/display/redirect?page=changes>

Changes:

[noreply] [Beam-9543] support MATCH_RECOGNIZE with NFA (#12532)


------------------------------------------
[...truncated 282.24 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
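
The exception above is Beam's generic message for a PCollection of Rows that has no schema attached, so no RowCoder can be inferred. In a user pipeline the usual remedy is the one the message names, PCollection.setRowSchema; the sketch below illustrates it with hypothetical data and field names taken from the query's projection (it is not the integration test's code, nor a fix for this particular failure).

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Schema mirroring the projected columns of the query in the log.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        Row row =
            Row.withSchema(schema).addValues("someone", "story", "A title", 42L).build();

        // Attaching the schema is what lets Beam infer a coder for the Rows.
        // When the rows come from another transform, the equivalent call is
        // rows.setRowSchema(schema) before applying downstream transforms.
        PCollection<Row> rows = p.apply(Create.of(row).withRowSchema(schema));

        p.run().waitUntilFinish();
      }
    }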

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 11, 2020 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 11, 2020 6:45:31 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-_9Cfd3ryUawBsh3msigNEAa8Cvi10V8wJrscUSfwYQk.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-grRhjzORfFSbq5pz6zhNt5b2zFoC5xK0P25yj88INi4.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests--B1zo9QNCliMwvbRjzBWmZRL2WaCrLNdpNruYKFDW0U.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-qFu9OuvheSS45-P5pFZDFRzsUMxjS4jsPVT6h9IMwRI.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-Xo8iPnbB0F48b7d8k5TCJY0LYhj1tkpjb9MQIrWmDo8.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-MMZ7bA7QAggCus1hdIoSgun4WOFH_aVMR_nomKnRMzA.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8133321743153480524.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-eWCyglhyGERIMtTGYsGbjobXe4PuGxONbUcBkjmTxWk.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-BQMhGqNFQSEIZyOtbCLr8FFLErGS8N8h2RBbpS7wUkU.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-qSUTeufk8e_I0zfQt6OIgg7OyRNGmQXy4rVMCDPdE8E.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-g1FKzvEu1xwmB9oXVghpNw1dszE7Am7GaDmTn71Wdks.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-rVcQRS8_RbiOZDyRiGzdH8WyH24-Q3aJsSXjJsLr69M.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-aJvON1v8KfCMbwy14yBZocMEIYGJTrQRaRRgr9V4KoI.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-PqZ8lMBtgykrlw5u32ERpD-IuuE-7AqAtseeJFycbuY.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-B1vLv0BHOGojq8Q-aqTl8_YvOvT_uyq3f-BxH0qkUNA.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-voDwHFf0uMJRnjtL6iubdx6UEiFTHsAeGGDS17CAbS4.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-QWSFan_77uW0nUqBzH1nE4KidAvITMozko0nFMPeEF8.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-_9Cfd3ryUawBsh3msigNEAa8Cvi10V8wJrscUSfwYQk.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-e7OaK-HFcOtYzmZASvl7F31XOyHc7Vqdv_qEya_MJoU.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-7r30WNhrp1uxUAHK_8FKQvcTpO-JHVePCz3526fhBB0.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-J6l-1m96I4bW7SucJlEGKAgkfaEFxzELKLtty4jOpuo.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-0nS4qFQUaurprUnpqG_t6WheA5bwqMU9SOFtdmBmX_M.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-k1AjSlcJdAOSCU_fcTWY9DGSGOTZYFrD7qI064FaVnc.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-kI_qa3dHFGQ8tV97GravtY94gyuZoseTDFlQXK28f4Y.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-1OoNaZIcuiXvHbMTrSPM3s2vdPdGBEyzV5zBbYKVRMY.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-CNlg06FxFftf-dth9vVu2hLm3ip1JbTO0Z7jCosjTHc.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-UFA5E5gbUTTAXExYB45VUPpPE1lLdrHkQZDQ5yKqTsw.jar
    Sep 11, 2020 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-jk9kJjTYu-BV-iwb-DGQ9lZy5LSsqyA4vbTHHPIsJvY.jar
    Sep 11, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-MDP9QiMokdVITk3kiybeLrH9w8jirlF_JzZXe89Sg_k.jar
    Sep 11, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-glqEdPkCBXH7kz2UEFK_58-RFi2Thk5WLfoOhLRML98.jar
    Sep 11, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-56cOC1vWx9OMJ9lnBwTIqk3Es9SfdCF71iEKSWpUYyg.jar
    Sep 11, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-W1nt0iES5_HP-DyNhfW_FcLK0iDEy0mEBbVPCK45ccY.jar
    Sep 11, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 0 seconds
    Sep 11, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 11, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 11, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 11, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 11, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 11, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92071 bytes, hash df41dfea354c41904b5d163e0d7232825928225c7179d2ed2fabf9a25bfbc645> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-30Hf6jVMQZBLXRY-DXIyglkoIlxxedLtL6v5olv7xkU.pb
    Sep 11, 2020 6:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 11, 2020 6:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-10_23_45_35-7573820781821908256?project=apache-beam-testing
    Sep 11, 2020 6:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-10_23_45_35-7573820781821908256
    Sep 11, 2020 6:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-10_23_45_35-7573820781821908256
    Sep 11, 2020 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-11T06:45:35.112Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 11, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:45:42.759Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 11, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:45:43.639Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 11, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:45:43.671Z: Expanding GroupByKey operations into optimizable parts.
    Sep 11, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:45:43.834Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 11, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:45:43.904Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 11, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:45:43.935Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 11, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:45:43.965Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 11, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:45:43.997Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 11, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:45:44.460Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 11, 2020 6:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:45:44.540Z: Starting 5 workers in us-central1-b...
    Sep 11, 2020 6:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:46:16.785Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 11, 2020 6:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-11T06:46:17.899Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 11, 2020 6:46:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:46:41.289Z: Workers have started successfully.
    Sep 11, 2020 6:46:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:46:41.318Z: Workers have started successfully.
    Sep 11, 2020 6:47:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:47:11.874Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 11, 2020 6:47:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:47:12.029Z: Cleaning up.
    Sep 11, 2020 6:47:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:47:12.116Z: Stopping worker pool...
    Sep 11, 2020 6:47:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:47:53.710Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 11, 2020 6:47:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T06:47:53.843Z: Worker pool stopped.
    Sep 11, 2020 6:48:02 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-10_23_45_35-7573820781821908256 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): a317ed69-a16d-4d36-becd-a847dbd0b38d and timestamp: 2020-09-11T06:48:02.281000000Z:
                     Metric:                    Value:
                   read_time                    12.324
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 11, 2020 6:48:02 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.017 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 2 mins 40.082 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 47s
107 actionable tasks: 74 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/zws2rkqjjchqi

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #980

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/980/display/redirect?page=changes>

Changes:

[Udi Meiri] [BEAM-10701] Fix Python coverage reporting

[Kyle Weaver] [BEAM-10762] Fix artifact staging bug in Flink/Spark uber jar runners.


------------------------------------------
[...truncated 281.52 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 11, 2020 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 11, 2020 12:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-ieC58hUm98_y-GGagrXdEkZqtvQ2l5YRZRLzJCrmEQE.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-yNiqRMgHr6NJXv_AVCStjnG2DQz9nTOWMVgwAh4bEd4.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-MwSOUj-MyNtQWa-5aX0riw_6W1QdXWnC4CEDKD4ATfs.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-hFo60eUWGUA-bJYW_peVkOhDLNS3-p0RnHX8OLZlzKg.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-F4xeCjVy7_-EVdlzZ7E9NfbBStEpdqBQY9NfwaoAr9w.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-XWvWQjYfRx9hpeQq3OLzXMxFqEna7qj5yn8PgPceYXk.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-swXMvVBmRN5BhKpVo2PipiNsvEtJCdfzQZ9NdiuXMZE.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-O_LNUBkpcdJR4kPf4jmFBQTg9hAVP7FcSfEGbZhnCaU.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-cOzH1nlhBvi_4eJGfWdP3OIWpAzJBorGXRzpHJm59z8.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-2gV0dJrWb0Vjd0fall-eH5VnwRbOtEDyprMPf4dLg7c.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4667017373377776618.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-KZTFHs5qPcwf4iNogdV2cDlXWv6eicRxr_KM8KxFEsg.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-wMxMW2gtNf-YiJLjoRnbobjDnrJJjivBF6xvFECPxd8.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-2E95Q_9wp5aRg4K7l92AnMVk_pxqhi0BEUqr_MAdvBU.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-ieC58hUm98_y-GGagrXdEkZqtvQ2l5YRZRLzJCrmEQE.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-eUtrGBQNP1KLDz9z0O3L0PYQCWaKv28cQLf6MuXETJE.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-HHYUr_S25aLAbLc9MfF1RENP1rUfZ4j0ne9pwqUaghM.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-wvnnRz1CeTGtTHOJN0PafoY5P3kiE3gxpDukjIYZ_ts.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-2Yxux775geaF8obh0MKUaRGlds8vwUbkqEUYqpVU5P0.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-BYxr2siFjGR85CyNdPlR7nOtu2B_a8HzM-D8y56vqr8.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-pktgizmo61FVknEbPbmt_E7NHdKoEPmpkJ8z4UUkZ_c.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-8PX9jEMWSTaJRHTSAdbMiW21wQEBDXubTB1MJG8WNps.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-TSNfr4W1eOjvUri3qrIjTZZ167t4qz3N2NEW2A5kQz4.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-Gm5KVx8ok8uraHFXhVclw1cWP-yKKUnhkdzZfHT6bYQ.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-JPL5EzLUoK4qUZBSPW2agNWlMCcB-Q6_J7JLco5PzzM.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-728kO6QeW-vGeZ6lDMCTbIFNlgA5TldktwRhMv7bjDA.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-V5ouJnbHDDwOraZG_5sxKo4XmE46gpNetsSsgTPFYCY.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-367uDZRB-uVwnqegAZpwBS-vnQLcAQ6Xerne7kdBMNE.jar
    Sep 11, 2020 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-IxjW3mW5s8RqCzUwpcbuZdDj4UUoLDG9bHD2WB9FgdM.jar
    Sep 11, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-M7qFJwEJMizpw1K05G4xxMlimiXhKp9KjfBdgscvn5c.jar
    Sep 11, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-n0qyaDhmefLsQ4aPvfbvdqTQjE-N0UDMO1mCqDMr7Uk.jar
    Sep 11, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-MGBYUlK7mDrI7CKxMqI2iAoQhZ2vAZAPvQV-JCNsGDI.jar
    Sep 11, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Sep 11, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 11, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 11, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 11, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 11, 2020 12:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 11, 2020 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92071 bytes, hash c3e0b58032478fda0d5d96aab7010c2192346a57bd0305fcd2e2712d2690e043> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-w-C1gDJHj9oNXZaqtwEMIZI0ale9AwX80uJxLSaQ4EM.pb
    Sep 11, 2020 12:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 11, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-10_17_45_33-5855299628727253134?project=apache-beam-testing
    Sep 11, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-10_17_45_33-5855299628727253134
    Sep 11, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-10_17_45_33-5855299628727253134
    Sep 11, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-11T00:45:33.231Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 11, 2020 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:45:41.150Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 11, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:45:42.268Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 11, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:45:42.295Z: Expanding GroupByKey operations into optimizable parts.
    Sep 11, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:45:42.321Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 11, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:45:42.398Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 11, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:45:42.428Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 11, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:45:42.464Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 11, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:45:42.498Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 11, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:45:43.092Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 11, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:45:43.182Z: Starting 5 workers in us-central1-a...
    Sep 11, 2020 12:46:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-11T00:46:10.074Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 11, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:46:11.170Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 11, 2020 12:46:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:46:26.384Z: Workers have started successfully.
    Sep 11, 2020 12:46:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:46:26.418Z: Workers have started successfully.
    Sep 11, 2020 12:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:46:58.398Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 11, 2020 12:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:46:58.540Z: Cleaning up.
    Sep 11, 2020 12:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:46:58.632Z: Stopping worker pool...
    Sep 11, 2020 12:47:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:47:47.683Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 11, 2020 12:47:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-11T00:47:47.723Z: Worker pool stopped.
    Sep 11, 2020 12:48:04 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-10_17_45_33-5855299628727253134 finished with status DONE.
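
For readUsingDirectReadMethodPushDown the planner replaces the generic BeamIOSourceRel with a BeamPushDownIOSourceRel, projecting only the used fields and handing the supported predicate to BigQuery. At the BigQueryIO level this roughly corresponds to a Storage API read with selected fields and a row restriction; the sketch below is an illustration only, the table reference is a placeholder, and the test builds its read through BigQueryTable rather than calling BigQueryIO directly.

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // "your-project:your_dataset.hacker_news" is a placeholder table reference.
        PCollection<TableRow> rows = p.apply(
            BigQueryIO.readTableRows()
                .from("your-project:your_dataset.hacker_news")
                .withMethod(TypedRead.Method.DIRECT_READ)
                // Column projection corresponding to usedFields in the plan above.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Row restriction corresponding to the pushed-down filter.
                .withRowRestriction(
                    "(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }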

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 771ca346-c817-41c0-a46f-1b27e15250a3 and timestamp: 2020-09-11T00:48:04.489000000Z:
                     Metric:                    Value:
                   read_time                    13.742
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 11, 2020 12:48:05 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 2 mins 44.035 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 47s
107 actionable tasks: 73 executed, 34 from cache

Publishing build scan...
https://gradle.com/s/frxymp36dbfhe

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #979

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/979/display/redirect?page=changes>

Changes:

[txian] Upgrade cloud spanner version to 1.59.0

[txian] Upgrade gax_version to 1.57.1

[txian] Upgrade google_cloud_core_version and google_cloud_bigtable_client_core

[txian] Bump gax_httpjson to 0.74.1

[txian] Bump com.google.cloud:google-cloud-core-http to 1.93.7

[txian] Revert version bump of google-cloud-core-http

[txian] Revert "Bump gax_httpjson to 0.74.1"

[txian] Update bigtable-client-core to 1.14.1 (released version)

[txian] Update bigtable-client-core to 1.15.0

[txian] Bump google_cloud_bigtable_client_core to 1.15.1 (local build)

[txian] Add local maven repo

[txian] Revert "Add local maven repo"

[txian] Add local maven repo again

[txian] Update bigtable-client-core to 1.15.1-SNAPSHOT

[txian] Update bigtable-client-core to 1.16.0,

[txian] Revert change of adding mavenLocal

[txian] Fix test failure of GcpApiSurfaceTest

[txian] Specifically call out exposed classes (instead of an entire package) in

[txian] Revert change of SpannerSchema and will make it a separate PR.

[txian] Add a separate blank line.

[kamil.galuszka] [BEAM-9456] Upgrade Gradle to 6.6.1 (smaller version of #12568)

[txian] Add TODO in GcpApiSurfaceTest: TODO: remove newly-exposed classes once

[Robert Bradshaw] [BEAM-8893] Fix issues with state and multiple workers.

[aromanenko.dev] Update doc URLs for RabbitMqIO and KuduIO

[srohde] Fix data races in BCJ and RecordingManager

[noreply] [BEAM-5757] Updates CHANGES.md for ElasticsearchIO delete function

[noreply] [BEAM-10701] Change coveralls badge to codecov (#12768)


------------------------------------------
[...truncated 293.02 KB...]
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 10, 2020 6:45:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 10, 2020 6:45:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 6:45:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2020 6:45:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 10, 2020 6:45:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 6:45:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2020 6:45:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 10, 2020 6:45:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2020 6:45:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 6:45:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2020 6:45:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2020 6:45:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 6:45:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2020 6:45:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 10, 2020 6:45:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 10, 2020 6:45:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-0WEIyarDFMQOgVVqWiYp2F7VpnG6Fxii480LAFon9mU.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-gfRbazVn-8pD5Npk_tABntlQaUktjQvp8WUW8_Qjjbo.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-fO1DONJRbRbyiwfU-ohklP63dpjdzBT3E9JKgwfUTwY.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/6.6.1/workerMain/gradle-worker.jar to gs://temp-storage-for-perf-tests/loadtests/staging/gradle-worker-U_aL5a9pcLxwBxCREhg6dHkIraZJd5OxEJ7qw-KLY7I.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-IPIaEyYFK6O7F8xDFkCbE1S25Ww2Qdok0JhAQXquA80.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-uN-VGOlSj4l97hwBu4uOy-MerHzdglcWK2eTk1FOhrc.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-pE_pWjWLYj3EAJHbiNa6kTqyoYbrfJUq06BL_02cnWk.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-Hat2RYrzChC-8lnD9969Q0f9QUDSkZjQzbhOKXSRWg0.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-xPFZ_ahYOGy-T3Go11fZpKbgX0g01DDIoCD70YghAhs.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-ygDQSgSv_fEXzmjtOpXEFGZh6oC0KUge-g8RT3U9cwM.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-ylu8RvH5CDVyFdfyeCOD_XeCxFA7uTuyuKu-hqv6WR8.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tcZLEJWJ5NiIi4Pz5H2ASgIy0iSZedhUho9VvXc5Blc.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-5dGQdxbOtnakQaJMEL3rGaRyi0Ntqsxu5qaT05shMsI.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-ylu8RvH5CDVyFdfyeCOD_XeCxFA7uTuyuKu-hqv6WR8.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-rPqo6N2YwR3JKE5B6TUi9waW4JA4xK6qXsSbAIDhe0o.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-ynPBJ0d9WgCfCO3DRzpylxXRd5Ujt2w4XE61yWv0ul8.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-n-0TkJ8gzoIyJ-kuj-FchiKwG66axH6rlrHFc9hPrwc.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-rGsXwuaE0BG1L_4H1OjUV-S8Hqf1lTjb_M_diNuiC4U.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-7Sru_uvgEAv2Yu8KdLVrGIrUzPk93kIhZx8po7TDQbw.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-SSruovCV-lMDia47TS8oZa7Qfv4b_pUHoY2AIpsnhSc.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-vEibZ0ZgfELaCNCvSXRfMEBEwSmbdQbVgqwoBVSxP0M.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-MPPpUxW8WFCdmfCcLuPvITI9frmj1yYEFLVVrhL2j1U.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-lmSjZ4viXpD9qCY1Fyz4Ivwgn802NllhV9CVRLN73So.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-AKOLNVtosPXnO0VcBFW7It83AfSQZbJ04q__KztCaSA.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7050410037367012818.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-SnNqWqOTf7Q4o14QYmAI_Z548PPCympo5hLrF0cT4cw.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-UewxIgW1qm6Tq400fgPfHX3iiRcb-dALAauphIroAJY.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-CO2HfwqpydQG9WVPMoz-DTZRTw0A_M8OcVeBjfFm2aQ.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-3EI3Sob0z1WAvwuii8FqDa804AIbLNjb2-h7Kwa1_NA.jar
    Sep 10, 2020 6:46:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-RbG2cOYr1dH30GOMesOjDuopIWvjSBWMQWpoHZCzq4Y.jar
    Sep 10, 2020 6:46:01 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-3X09vT6ccpFBGNEfZzA86ceey6AFhA0QDSQKUHBJVXo.jar
    Sep 10, 2020 6:46:01 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-tuMN9Na9L0jT0Ga4tY937Zy_HplZ99dDuLW5PRv3WFo.jar
    Sep 10, 2020 6:46:01 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-BTxzgQLf2hcY68KBzzf4MkdYpk9m5jd-kShxDoBugS4.jar
    Sep 10, 2020 6:46:01 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 179 files cached, 31 files newly uploaded in 1 seconds
    Sep 10, 2020 6:46:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 10, 2020 6:46:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 10, 2020 6:46:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 10, 2020 6:46:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 10, 2020 6:46:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 10, 2020 6:46:02 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92071 bytes, hash 2e5a71d66a6aa763115b7acb4e8a3c04ec196f4f124280d7b6eba07a1451ca0f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Llpx1mpqp2MRW3rLToo8BOwZb08SQoDXtuugehRRyg8.pb
    Sep 10, 2020 6:46:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 10, 2020 6:46:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-10_11_46_02-10229227285528791390?project=apache-beam-testing
    Sep 10, 2020 6:46:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-10_11_46_02-10229227285528791390
    Sep 10, 2020 6:46:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-10_11_46_02-10229227285528791390
    Sep 10, 2020 6:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-10T18:46:02.569Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 10, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:46:10.077Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 10, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:46:10.711Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 10, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:46:10.787Z: Expanding GroupByKey operations into optimizable parts.
    Sep 10, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:46:10.832Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 10, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:46:10.972Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 10, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:46:11.008Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 10, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:46:11.035Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 10, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:46:11.058Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 10, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:46:11.433Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 10, 2020 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:46:11.509Z: Starting 5 workers in us-central1-b...
    Sep 10, 2020 6:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-10T18:46:32.552Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 10, 2020 6:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:46:38.893Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 10, 2020 6:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:47:01.737Z: Workers have started successfully.
    Sep 10, 2020 6:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:47:01.784Z: Workers have started successfully.
    Sep 10, 2020 6:47:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:47:44.302Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 10, 2020 6:47:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:47:44.474Z: Cleaning up.
    Sep 10, 2020 6:47:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:47:44.579Z: Stopping worker pool...
    Sep 10, 2020 6:48:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:48:28.982Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 10, 2020 6:48:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T18:48:29.051Z: Worker pool stopped.
    Sep 10, 2020 6:48:37 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-10_11_46_02-10229227285528791390 finished with status DONE.
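
The autoscaling warning earlier in this run (the requested max number of workers is ignored because autoscalingAlgorithm=NONE) comes from the interaction of the Dataflow worker-pool options. A small sketch of that interaction, assuming the standard DataflowPipelineOptions setters; the option values here are illustrative, not the test's configuration:

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class WorkerPoolOptionsSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);

        // With autoscaling disabled, Dataflow honours numWorkers and ignores
        // maxNumWorkers, which is what the warning above points out.
        options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE);
        options.setNumWorkers(5);

        // To let maxNumWorkers take effect, enable throughput-based autoscaling instead:
        // options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.THROUGHPUT_BASED);
        // options.setMaxNumWorkers(5);
      }
    }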

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): be1e99a3-ff11-4186-8397-1f6fa5c8a672 and timestamp: 2020-09-10T18:48:37.483000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    22.518

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 10, 2020 6:48:38 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 2 mins 50.05 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 19s
107 actionable tasks: 80 executed, 27 from cache

Publishing build scan...
https://gradle.com/s/bjeuvz7hzlypw

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #978

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/978/display/redirect>

Changes:


------------------------------------------
[...truncated 294.72 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 10, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 10, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 10, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
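
    The IllegalStateException above also names its remedy: a PCollection of Beam Rows needs either an explicit schema (PCollection.setRowSchema) or an explicit coder (setCoder). Below is a minimal, self-contained Java sketch of that remedy, assuming only the Beam Java SDK and the direct runner on the classpath; the schema, values, and pass-through DoFn are hypothetical stand-ins for the test's BigQuery source and RowMonitor ParDo, not the test's actual code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Hypothetical schema mirroring the projected columns of the query above.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        Row row =
            Row.withSchema(schema).addValues("someone", "story", "a title", 3L).build();

        // Give the source PCollection an explicit Row coder so inference is not needed.
        PCollection<Row> rows = p.apply(Create.of(row).withCoder(RowCoder.of(schema)));

        // A pass-through ParDo produces another Row PCollection; without a schema its
        // coder cannot be inferred, which is the situation the failing test runs into.
        PCollection<Row> monitored =
            rows.apply(
                    ParDo.of(
                        new DoFn<Row, Row>() {
                          @ProcessElement
                          public void processElement(@Element Row r, OutputReceiver<Row> out) {
                            out.output(r);
                          }
                        }))
                // Remedy named by the error: attach the schema to the ParDo output.
                // The alternative is an explicit monitored.setCoder(RowCoder.of(schema)).
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }

    Either call resolves the coder before the PCollection is finalized, so pipeline construction no longer fails at finishSpecifying.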

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 10, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 12:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2020 12:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 10, 2020 12:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 10, 2020 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-bKZ--vXSc_lpMwP_g_ibuOP96hTvylQGxzw4BBNsPvw.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-swO3_W5btQ8suCbZYkThwgu0txNgKQM4ECIrG95LA4s.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-UreiqRyhKLY7vrfYm8oV0jM5QnqkcZTunFj8nBm4mAU.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-OR0gg3mlpnmWmRvykPVolmpE8fwCGPzuy4MRKyr3xhY.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-v5ii0caRXVPW7EAh0fi55E_xfJGJu-WmYM631f7_RuQ.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-OEVeHonuuR3_dEW6FEgpmnW8a75vF1JGa8D0FDQzQSo.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-GJlgYeu-qEsAHqNOwP3YCvWSLt5IVO3dbx7jeRtc8A0.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-bKZ--vXSc_lpMwP_g_ibuOP96hTvylQGxzw4BBNsPvw.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-g3ldI_8PO0lhdcrRgqY_gF2AgLcXARYUHpt8und40BI.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-2EobleMsF58lRm_tPeaOtdwkz-wN2J_6Xkaq26-ZYCQ.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5582218162116404072.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-NaUE-U4M601QqY3P-P-wZA6P2I4sS17ak9xM7O6P6J8.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-cvLBlzfp9E96om3fUdTp88e2Lyfc8VP9vycPs3JpBNk.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-AULZiL_TnR2690SPT5Br30vI59qQo6xRgw7h7dBZxYY.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-63fraZHs3F9MQL3Goevb90cxMTT9-XNw66w0kZYdEis.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-kloqnlk8DJw3AtP9UVjT9bxRazvoR6acDXUJ7gguNkQ.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-VBoLJtMq3RO8Q2ZjztdM4VaWv6ls7f3ozdKwCXHBR-w.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-4SpeeWzfZb5G66MqEQsxVa9da1wYpyc0-N2Dhz9SKkw.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-8T72Uemx6Cr8OrVOD_9H51CoRXxeal-oyru9wLEwmWg.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-Gvrgu-fsEfrMvrhn-avn7J3QvyXnEBpi9tF9TzufH8Q.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-l7lpoFFfLLRfZ6xzN-IS2bF7byLWCqlJ8cTLAaBKJ4Q.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-P9IGvbM75a99kwZ5-UCdRg93org2WxP0yT_7Fm9TntE.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-eUqCPlmWCTlHPBTg3JBVfqdyylgej0SCEUNZS-ObtUM.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-CnpGWYCWFQTwWeNq517iCyax0joMgujeqWjauxQfVtU.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-hHx1F9nUjrVjnN31-uaAAshcdod8KtoQTwDvVoSDZp4.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-y2vrRA93gwyQ-Vu_Guu5lYSdX8dQ7DB7Rv8858aBwoA.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-uAFqddBaBNkgmGta92crO300A6OuhWkFRhUUYOM-hKM.jar
    Sep 10, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-v03n7tTZ7kClh7RleYFIBdZZwsFIi7zD4vrCafwJsIo.jar
    Sep 10, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-MDfiTdtL7SjiYUHrTCVOZWgYpRBLmOr2i7X2onACPJE.jar
    Sep 10, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-J7Ij4QfdtZHV45DkV-4MMwEIz4qztF3BTgKeNuxxVvc.jar
    Sep 10, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-d8_AcpCc2kCFIdyRVTBpvbf0J_D8Zp9jjqvbzevZVME.jar
    Sep 10, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-SyzhaIQjE_j0dsJBOD_DLfZDLu9n2dnBFvrYusY6mhI.jar
    Sep 10, 2020 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Sep 10, 2020 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 10, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 10, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 10, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 10, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 10, 2020 12:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 40f7295ce05905faa5eef0d54dd460e1c87154aa1f6259eba9dd3fa23c157fc1> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-QPcpXOBZBfql7vDVTdRg4chxVKofYlnrqd0_ojwVf8E.pb
    Sep 10, 2020 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 10, 2020 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-10_05_45_26-12099994824242108149?project=apache-beam-testing
    Sep 10, 2020 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-10_05_45_26-12099994824242108149
    Sep 10, 2020 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-10_05_45_26-12099994824242108149
    Sep 10, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-10T12:45:26.409Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 10, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:45:40.695Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 10, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:45:41.493Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 10, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:45:41.525Z: Expanding GroupByKey operations into optimizable parts.
    Sep 10, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:45:41.560Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 10, 2020 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:45:41.637Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 10, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:45:41.672Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 10, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:45:41.703Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 10, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:45:41.736Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 10, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:45:42.070Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 10, 2020 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:45:42.142Z: Starting 5 workers in us-central1-b...
    Sep 10, 2020 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-10T12:45:58.756Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 10, 2020 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:46:13.615Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 10, 2020 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:46:30.505Z: Workers have started successfully.
    Sep 10, 2020 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:46:30.537Z: Workers have started successfully.
    Sep 10, 2020 12:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:47:01.845Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 10, 2020 12:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:47:01.993Z: Cleaning up.
    Sep 10, 2020 12:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:47:02.078Z: Stopping worker pool...
    Sep 10, 2020 12:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:47:42.965Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 10, 2020 12:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T12:47:43.015Z: Worker pool stopped.
    Sep 10, 2020 12:47:51 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-10_05_45_26-12099994824242108149 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0bbd0d36-716e-444d-b08f-7fd73f7923b1 and timestamp: 2020-09-10T12:47:51.630000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.822

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 10, 2020 12:47:52 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 2 mins 38.068 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 31s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/ev62opagvwbfk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #977

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/977/display/redirect>

Changes:


------------------------------------------
[...truncated 293.94 KB...]
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 10, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 10, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 10, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 10, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 6:45:22 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2020 6:45:23 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 10, 2020 6:45:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 10, 2020 6:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-VUwTsVJXAT4v2dq-0zOGrWuNKE5mnFF2FSS7G3Ymx4c.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-g9tQO40nexa6SGT-EN2q87vMgKPDjjK5xw7Yfa1qd-w.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-otUD7SDNHrlrsu84tduhAo6PDhYIQFlFCyUSjedzKSQ.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-TQ_1B9nUnmepfkYMjVNpb7k_8AMKgWiRl-6CONdYfvQ.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-MvskQvZlm2NL0cNiy16NyJvnvptW1r-WA3LDD36MYw4.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-h7gsBqHgFnnbP6mBl1L_3PovR5Yap6H0Ix39QN17xUs.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-VUwTsVJXAT4v2dq-0zOGrWuNKE5mnFF2FSS7G3Ymx4c.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-TKt8XCp1ZTRsaAUxF7b_5g-rzUaMJsxYxAa6BsXxkbY.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-j_8ZBV8bKS3CEfff6Ox10UsJ12FERQYQlUfWHrG0ayc.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-FIdkaePSg-roiTaXbsRH-MpKNoOB0_8nHENQUip1Clg.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-bxlnRT6BDrgd-xNcg1_a4NOZqpWoR_iE65j23teR1tI.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-LzfOsIPNbiDEYPh_r2cZXIeFpccsoL9M_Ux_Kp3xqwk.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-1DHU3p5MkVmBT3SYoLishgfhQ6OE2DfCJsuduWQ1wno.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-C5I0o9POTH0t04T8gxtB_y5TvW5GctbhUSmKc9pxNFY.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-SMu3P_It-g_M8f3ryIcpp_aACRFmbFChrsTxHeHGulI.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-LvaGEtv1IEKQZ2IRX8Rb3bTTIYgR9Jcq-V_qrRnF_SI.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-E1u7AJUQ_vagLBWNBR_bHUksG9y0LVS_RDXghCKEpOQ.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3742486861977036169.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-OyhEayZlIwxObvSn9-UJQuvk6b8m3sh9v8cFa0t9604.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-pG3NxuhV4gJmXPcVCuz2MI_vmTPhpJK1pL9Vcg4AECs.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-tED6mS84D7PDSN3SoyjnCQ_0wHyAcbOFGN7uAIiwYzM.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-iE8fXSAGu8wnGp_X6lhLh_ElAjDcw7SvogT5Z11XkFI.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-0VDBB7Bo4z_iS-RHiGkA3Y9h3oyvhH6V9VDX34M0aDg.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-6wNlYc-PmzpPkVmTDb5Q6QyNRFmnz1MS7PJ64icTah0.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-zhyAYLZlprGeoXmLlbBLJzvFDv9CI66qUTRKiDJu2o8.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-5VVS2pl-8INmkFYLOwb7qmkMKcS_qq_myvqD0Uxt1ng.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-w8s5Yl0eAXowbxmLWOMWrPRFvTZ4pcMoFWeizEZ8dNE.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-xriEgQ3AkVAGcZpNQ4tZ62Eze6u0541pew2flprBHNY.jar
    Sep 10, 2020 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-YlCjCKmIYmxM0-PnPN-rjddcxHp78DlAynfHhgXs2IE.jar
    Sep 10, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-Tr0sMbKnk4DRhv35IgzCauLv3vCcEZWyZa9QW9HFj8E.jar
    Sep 10, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-eh4C51dSP31OOzxn8iE__H5GpL51JLTq3v4cc4ls89Q.jar
    Sep 10, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-iWCq_URT5Wh-2ziLlJ21iI6Mtn9nmOBY3gEc66O5mRI.jar
    Sep 10, 2020 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Sep 10, 2020 6:45:26 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 10, 2020 6:45:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 10, 2020 6:45:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 10, 2020 6:45:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 10, 2020 6:45:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 10, 2020 6:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92055 bytes, hash ec6f43335c1f63ca7ec3760aa25c943fdf35ad62dfb807f5f677807d475f3f4e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-7G9DM1wfY8p-w3YKolyUP981rWLfuAf19neAfUdfP04.pb
    Sep 10, 2020 6:45:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 10, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-09_23_45_27-932784473032290779?project=apache-beam-testing
    Sep 10, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-09_23_45_27-932784473032290779
    Sep 10, 2020 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-09_23_45_27-932784473032290779
    Sep 10, 2020 6:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-10T06:45:27.397Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 10, 2020 6:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:45:40.271Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 10, 2020 6:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:45:40.729Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 10, 2020 6:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:45:40.873Z: Expanding GroupByKey operations into optimizable parts.
    Sep 10, 2020 6:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:45:40.901Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 10, 2020 6:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:45:40.963Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 10, 2020 6:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:45:40.991Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 10, 2020 6:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:45:41.016Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 10, 2020 6:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:45:41.039Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 10, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:45:41.545Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 10, 2020 6:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:45:41.701Z: Starting 5 workers in us-central1-b...
    Sep 10, 2020 6:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-10T06:46:07.480Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 10, 2020 6:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:46:12.416Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 10, 2020 6:46:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:46:30.008Z: Workers have started successfully.
    Sep 10, 2020 6:46:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:46:30.045Z: Workers have started successfully.
    Sep 10, 2020 6:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:46:59.901Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 10, 2020 6:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:47:00.048Z: Cleaning up.
    Sep 10, 2020 6:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:47:00.125Z: Stopping worker pool...
    Sep 10, 2020 6:47:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:47:58.037Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 10, 2020 6:47:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T06:47:58.075Z: Worker pool stopped.
    Sep 10, 2020 6:48:06 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-09_23_45_27-932784473032290779 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 05114464-da91-4851-8612-09bd061edf28 and timestamp: 2020-09-10T06:48:06.535000000Z:
                     Metric:                    Value:
                   read_time                    14.395
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 10, 2020 6:48:06 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 2 mins 51.714 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 52s
106 actionable tasks: 71 executed, 35 from cache

Publishing build scan...
https://gradle.com/s/kru3chqxxijj4

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #976

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/976/display/redirect?page=changes>

Changes:

[jiadaixia] [BEAM-7925]add projection

[jiadaixia] [BEAM-7925]spotless

[jiadaixia] [BEAM-7925]add schema encoder

[jiadaixia] rename and remove duplicates

[jiadaixia] Modify description

[jiadaixia] Add description


------------------------------------------
[...truncated 295.56 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 10, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 10, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 10, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2020 12:45:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
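
[Editor's note] The IllegalStateException above is the generic "PCollection of Row without a schema" failure: the ParDo(RowMonitor) output carries Beam Row elements, so a schema (or an explicit RowCoder) must be attached before the next transform is applied. The following is a minimal, self-contained sketch of that fix, not the integration test's actual code; the schema fields, sample row, and the PassThroughFn name are illustrative assumptions chosen to mirror the projected HACKER_NEWS columns.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaExample {
      // Pass-through DoFn standing in for the test's ParDo(RowMonitor).
      static class PassThroughFn extends DoFn<Row, Row> {
        @ProcessElement
        public void processElement(@Element Row row, OutputReceiver<Row> out) {
          out.output(row);
        }
      }

      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create();

        // Illustrative schema roughly matching the projected columns (author, type, title, score).
        Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();

        Row sample =
            Row.withSchema(schema).addValues("someone", "story", "A title", 3L).build();

        PCollection<Row> rows =
            pipeline
                .apply(Create.of(sample).withRowSchema(schema))
                .apply("Monitor", ParDo.of(new PassThroughFn()))
                // Without this call (or an explicit .setCoder(RowCoder.of(schema))),
                // downstream transforms fail with the IllegalStateException shown above.
                .setRowSchema(schema);

        pipeline.run().waitUntilFinish();
      }
    }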

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 10, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 10, 2020 12:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
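
[Editor's note] A rough, hand-written sketch of what this projection and predicate push-down corresponds to at the BigQueryIO level when DIRECT_READ is used: only the referenced columns are requested and the planner's "supported" predicate is sent to the BigQuery Storage API as a row restriction. This is not the integration test's code, and the table spec below is an assumption used only for illustration.

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;
    import com.google.api.services.bigquery.model.TableRow;

    public class PushDownSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create();

        PCollection<TableRow> rows =
            pipeline.apply(
                "Read HACKER_NEWS with push-down",
                BigQueryIO.readTableRows()
                    // Assumed table spec; substitute the dataset actually used by the test.
                    .from("apache-beam-testing:beam.HACKER_NEWS")
                    .withMethod(Method.DIRECT_READ)
                    // Projection push-down: only the fields named in the SELECT list.
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // Predicate push-down: the filter reported as supported in the plan above.
                    .withRowRestriction(
                        "(`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        pipeline.run().waitUntilFinish();
      }
    }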
    Sep 10, 2020 12:45:31 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-R0-H23NttD1lPAAIRpqIiJpL40U6GGj0lSfajur1jaQ.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-T75vA3fLMFp7G0jIrNfQqhUQvDOfTvFv0BDTDAG17C4.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-R0-H23NttD1lPAAIRpqIiJpL40U6GGj0lSfajur1jaQ.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-P4LNbvUHXvH_iZU0pplMjxWHijL7okgvtKnowshP0WA.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-RHvmE6xUnaovnW1T42KgnPS0Y8tiWYyisI5HjFMU8XA.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-iJbeuw0UXNG9gC10M9AdyU1ArtSeh-F5TkKwEL7vtcU.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-MyAirtn59KFSEpav4dCT9sXRMw1BcUOBastDQ6wUH2w.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-Flmmcgx3NvWjHo1ncFCuua1Qsleh9RD1njS1xOrpL9k.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-T6Wxxk7EKhpJDYyh1aVHC5eO6_RRXVJnIgnVvOIEd8M.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-z0IRkf1abjjrMmlDuUFRw63fPwmzH5w9FkVKlx6SR1w.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-y6ldo4YBf6iqXGFpSYKsb7vaVWqNOpEtzP8rVLMBa68.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-jQv5gJYa36AzKM-p4slrvqJLLeIs4wf-MtX4LY4uBaY.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-e1MWu8BzLFZDkNKpS1uihYKPLsWOHydyvevez6rqC4U.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-cmJhTjHd2D6dz4jS8eTTv-jEdI_7-rAhbqPMHwBCrfQ.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-8Bu_ZBUw4ulNy4WI9pMjGI-RyQqCmnSjHQvr4fBnC-A.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-9c-oi-zBGRPJS-33q6oy_GYWKkQcnydCt7lCd7evIK0.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-6mzkacQ_Ca8Sx12C76OERKZYZHqwI991viKoA3V5LF0.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-FOcdLcq8PyDe7BBRXPS5Xm2crKDXbkObpyz5YW3emeM.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-VDx_88VswgjBUKMdx9yR8BSzpw49feKfD1htPefzcvM.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-bTfU-UL4HtQQ-74FnClCF5kkIwGbNxQ5FrKWc60_L90.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-LPFijZvDxlDGYeUORlt-pPChE-Or2lPLaqK_BEdt9ew.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5249225679430433749.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ArN4zFyQrRNnsnJQ60rFONJGV0EdMkyCzMqpGC__Gac.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-xdLQi32m6BlwApXBwrVv4KxliUljLTA2-UNtxsZRiCU.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-aridxMTwxSq5czx9PegWwRrFVE8BMFf-8YFG09BnZ1Q.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-bKKmgxvRTMtobiazKScpMLiB3yBjdddproMLP69j580.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-lWDPoE3HUZPKLoCzaR-EFLByT3xk_BTzTUEELOaxuc0.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT--ArfOt18QtvBbndx5xZVcH4PHOuTxWFyzSFqbWGTipE.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-ZoX8Da7UxgN1uKBW102Q_Xt0rNj0P-m2cyJYcXNMaJw.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-VD5Lg8Gu61rapePu8GQfjnkGhv3JeF9UfZMgenzakiw.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-y1LueXSHZr90YThR2dsnzpor7a6vFG6Fm94qPxtVGy0.jar
    Sep 10, 2020 12:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-vzvIKcmEC-qrxEuCmTgeZ1slP7VaxqDGR2N6YXZFWeY.jar
    Sep 10, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 1 seconds
    Sep 10, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 10, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 10, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 10, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 10, 2020 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 10, 2020 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 794422c3b772972ff6966fdd5c33ce33096730f8f1b1a374cecd5822dc3176a8> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-eUQiw7dyly_2lm_dXDPOMwlnMPjxsaN0zs1YItwxdqg.pb
    Sep 10, 2020 12:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 10, 2020 12:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-09_17_45_35-14607885282040864557?project=apache-beam-testing
    Sep 10, 2020 12:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-09_17_45_35-14607885282040864557
    Sep 10, 2020 12:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-09_17_45_35-14607885282040864557
    Sep 10, 2020 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-10T00:45:35.109Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 10, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:45:42.851Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 10, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:45:43.833Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 10, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:45:43.873Z: Expanding GroupByKey operations into optimizable parts.
    Sep 10, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:45:43.905Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 10, 2020 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:45:43.982Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 10, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:45:44.012Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 10, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:45:44.165Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 10, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:45:44.204Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 10, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:45:44.583Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 10, 2020 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:45:44.658Z: Starting 5 workers in us-central1-a...
    Sep 10, 2020 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-10T00:45:52.259Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 10, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:46:11.963Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Sep 10, 2020 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:46:11.993Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Sep 10, 2020 12:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:46:17.378Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 10, 2020 12:46:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:46:35.306Z: Workers have started successfully.
    Sep 10, 2020 12:46:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:46:35.348Z: Workers have started successfully.
    Sep 10, 2020 12:47:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:47:10.396Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 10, 2020 12:47:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:47:10.559Z: Cleaning up.
    Sep 10, 2020 12:47:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:47:10.646Z: Stopping worker pool...
    Sep 10, 2020 12:48:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:48:03.739Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 10, 2020 12:48:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-10T00:48:03.779Z: Worker pool stopped.
    Sep 10, 2020 12:48:11 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-09_17_45_35-14607885282040864557 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 74765804-825b-455d-99f8-4ab6324d6e5d and timestamp: 2020-09-10T00:48:12.038000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    14.627

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 10, 2020 12:48:12 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2 mins 51.836 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 56s
106 actionable tasks: 71 executed, 35 from cache

Publishing build scan...
https://gradle.com/s/jaa7vtf6n3hpk

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #975

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/975/display/redirect?page=changes>

Changes:

[yoshiki.obata] [BEAM-10860] avoid dictionary size change when shutting down

[Boyuan Zhang] [BEAM-10863] Change encoding of Pubsub sink to global window.

[noreply] [BEAM-10864] Update Snowflake JDBC dependency (#12793)

[noreply] Merge pull request #12709 from [BEAM-8258] add more options and

[Kamil Wasilewski] Fix Python formatting on master branch


------------------------------------------
[...truncated 296.60 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 09, 2020 6:47:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 09, 2020 6:47:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 6:47:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 09, 2020 6:47:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 09, 2020 6:47:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 6:47:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 09, 2020 6:47:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 09, 2020 6:47:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2020 6:47:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 6:47:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 09, 2020 6:47:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2020 6:47:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 6:47:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 09, 2020 6:47:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 09, 2020 6:47:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 09, 2020 6:47:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 09, 2020 6:47:10 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 09, 2020 6:47:10 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-YICBu8Grzf9ZBP6j7tWsC-0GJDZ5oyUnJ3DsskfYUhw.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6544634911890019447.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-b50ifWgrih565NhpjBxNl5td-JJzUb1bb79phh1UANY.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-j9nh2O5mQmmjbh6n0TVBS95IXAzClgxNZseIHnVIOTU.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-rESy8CvF81t01JAwhWWL_bN1H1KcvkVrRFPhtrxVYcA.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-iPCA6BHQklNZng1R5i1GsYmf0bjvZKZh3ucJUgITlwI.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-0awVRcQC6KvnZNP3aq8p8Y3ZpodG5QcmKtTYKno-SdI.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-g-y4-RILke7zrCB_82SPGyl0V2VlYBdoyF10LHv3BXA.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-vszIle_c2zNkfom5DG7NXZtII_6DX1mHPDsmoP7eRY4.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-JmL4ryGFQyhkByNxPoObOifl5B_tQcQako9ODbUrJQE.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-QQg-6FuOsAvN7p3XxLnJD5I2DHOKKfgOg0ChvJdqQ0Q.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-MYI9N8ufg3GBrdIdGqPyOdN29L7gWxm99zc6753ywDc.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-R1RjFPIUrc6oKuFCPlpUA3QxUzMdQR_EN90Zshx8S6I.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-pe75t5rTVRfvkx0nORIuUEiR_JlQVm9NCLskl0hAv64.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-g4aj9lucu7gA5Ey2gN59GyZ0cn9RdEwmSdB3Zu0Q49w.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-3Bf_QdwsFOud7itIr0rITzUI1Y_lis9vif1xxyr6Xwo.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-KXFOJaTfMtnVyKzk2WP8fHKbAtQ1DBmBZHJQbitS2R0.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-HGnSys2D60zl0XFdKYPJNxK2h5RNeeNKpveescVsjZk.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-WEmDh1ECPtaZEdw7vWnmjaCbPe_Nplz9U8Lo3j1T6l8.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-YICBu8Grzf9ZBP6j7tWsC-0GJDZ5oyUnJ3DsskfYUhw.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-1wSkz1ZJT3f0xRK1iuS2s49PglCU2Z3NHSg36OAq2Lw.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-KQKFAm-Y386XDpeXwK0n-TMGV8zVKbFHxjAlfdkV9jA.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-odbrvcKNrIIb8i0N8O94xnzQ7q8AuJUZyNpS6Dehx-8.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-z5dfzKjZb7zJyXmg0HSYKzJAQT8eAdS04mq4qG5KCTw.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-Ai-D8pESDu9NxiioQiMX9mTl75szBbPnvf6Wm-XhF5Y.jar
    Sep 09, 2020 6:47:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-hwbULFSswaQo8lWrBQ3NrNHLnYqvvgeV8RkBLcZsrvs.jar
    Sep 09, 2020 6:47:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-opj6yJGAHV98_jEDmVRNXPuJgnkFr1K0ne4WfyHv-44.jar
    Sep 09, 2020 6:47:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-RlUuyS_WAnHngyBL0TDvvq3I1UrY19qR_sqeEYTix9g.jar
    Sep 09, 2020 6:47:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-eNSjz8AimCTDpK-kYFSucg_k2rVVI5NAliMH-KCNdOI.jar
    Sep 09, 2020 6:47:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-QEixvbmWsXPlxde-LTNtycgMD2TqMH6leH14kJQuunY.jar
    Sep 09, 2020 6:47:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-A2WYqZ6SSG-t3bMXTZyELU-moCz3GxDi7uWpWga0XHg.jar
    Sep 09, 2020 6:47:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-l7gSN4eMtymTqqWU857ISG6q0IF5kI3GkS95_tnHXX4.jar
    Sep 09, 2020 6:47:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 5 seconds
    Sep 09, 2020 6:47:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 09, 2020 6:47:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 09, 2020 6:47:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 09, 2020 6:47:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 09, 2020 6:47:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 09, 2020 6:47:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92059 bytes, hash 487b9512cb92b01bf9e00dba85d509e2bf9e3ead50500aa1fe68bffb60b88465> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-SHuVEsuSsBv54A26hdUJ4r-ePq1QUAqh_mi_-2C4hGU.pb
    Sep 09, 2020 6:47:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 09, 2020 6:47:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-09_11_47_16-1923284464522544724?project=apache-beam-testing
    Sep 09, 2020 6:47:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-09_11_47_16-1923284464522544724
    Sep 09, 2020 6:47:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-09_11_47_16-1923284464522544724
    Sep 09, 2020 6:47:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-09T18:47:16.726Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 09, 2020 6:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:47:27.421Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 09, 2020 6:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:47:27.969Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 09, 2020 6:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:47:28.008Z: Expanding GroupByKey operations into optimizable parts.
    Sep 09, 2020 6:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:47:28.032Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 09, 2020 6:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:47:28.097Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 09, 2020 6:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:47:28.124Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 09, 2020 6:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:47:28.160Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 09, 2020 6:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:47:28.197Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 09, 2020 6:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:47:28.540Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 09, 2020 6:47:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:47:28.618Z: Starting 5 workers in us-central1-a...
    Sep 09, 2020 6:47:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-09T18:47:46.546Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 09, 2020 6:47:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:47:56.929Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 09, 2020 6:47:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:47:56.963Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 09, 2020 6:48:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:48:02.410Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 09, 2020 6:48:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:48:20.972Z: Workers have started successfully.
    Sep 09, 2020 6:48:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:48:21.009Z: Workers have started successfully.
    Sep 09, 2020 6:48:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:48:57.555Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 09, 2020 6:48:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:48:57.723Z: Cleaning up.
    Sep 09, 2020 6:48:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:48:57.809Z: Stopping worker pool...
    Sep 09, 2020 6:49:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:49:46.840Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 09, 2020 6:49:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T18:49:46.871Z: Worker pool stopped.
    Sep 09, 2020 6:49:56 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-09_11_47_16-1923284464522544724 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): a21a6231-8f8f-4846-b567-7f878b281bb0 and timestamp: 2020-09-09T18:49:56.313000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     17.95

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 09, 2020 6:49:56 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 3 mins 9.942 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 12s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/mzwwku6bafhte

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #974

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/974/display/redirect?page=changes>

Changes:

[heejong] [BEAM-10791] Identify and log additional information needed to debug

[heejong] share histogram in a single process

[heejong] add tests

[heejong] add comments

[heejong] safer locking

[heejong] addressing comments

[heejong] fix tests

[heejong] get atomic percentile loggings

[heejong] add tests


------------------------------------------
[...truncated 295.04 KB...]
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 09, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 09, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 09, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 09, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 09, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)
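
The exception above names its own remedies: manually set a coder on the RowMonitor output PCollection with setCoder(), or attach a schema with PCollection.setRowSchema() so a Row coder can be inferred. A minimal sketch of both, assuming a PCollection<Row> like the RowMonitor output and the four columns projected by the query in this log (the class name, field names, and field types below are illustrative, not the IT's actual code):

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowCoderFixSketch {
      static PCollection<Row> attachSchema(PCollection<Row> rows) {
        // Schema matching the projected columns: author, type, title, score.
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();
        // Remedy 1: attach the schema so the SDK can infer a Row coder.
        rows.setRowSchema(schema);
        // Remedy 2 (equivalent here): set the coder explicitly.
        // rows.setCoder(RowCoder.of(schema));
        return rows;
      }
    }

Either call has to happen before the PCollection is finished specifying, i.e. before the next transform is applied to it or the pipeline is run.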

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 09, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 09, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 09, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 09, 2020 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 09, 2020 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-OdKvzRPQrYh5wlQ8eDU8m1SE9V14TR88wCcFRsrLE18.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-AXGgRQ7p98SLFkRPBKTLAUF-GOy6v43W4f9c4kX0qqc.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-MGxPRMGqw-JEJP9sQE-r1UA3iffcdgV5GQLZBcWJ1MU.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-8ZOEWsW_AwkA91gme3yK2EEJJgRkHwDeDrYIOKtyDP8.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-OdKvzRPQrYh5wlQ8eDU8m1SE9V14TR88wCcFRsrLE18.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-2iDS7ZZ2_Vgllp-GdBlBUzcr699n5XIuMJQ9f52wiO8.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-oMCyP7lyxK5qlQfhVikUEdY_yrTgPGvyjSvp5JCo5dY.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-WGfzeIqj7JLNTo1RygYk0zBpO_wv0-Kkzoy0R0pTs78.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-FhBg6ADxeRn-_PYs8pD4s6GJBIIpzEYaaTelHrirTig.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-oK_N73F3v1YrAp_azAI6Rmoz8D5yBm8Kqy7CGxde8qE.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-hoWA2DKVBj_4SNlYCYcyfXkJTqP7TVbTEMpwaSaz3oA.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-ucTDth7iTBG36nNjqZZJu2NRFGcp2iLSUwd-n_S1qr0.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-IblYMgPcm-HeEQkq8KynjqMCmRdIFzo6ah8LAnsBL60.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-aL_y_BbnsuuF2vkC61KsoJR9AghTK5m9NAYZTX-BlI4.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-x2dkbt-wVHHs0PabIIfroRMiuG0dv895e3pUyq-k_Ko.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-vWktW_K3gG1ak4kUJYW3GocyoT03efuUpzDjomiXgBI.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-kXbF6BuPfxfyF_eGQoQR810RKdnq782on5lR_Jo-UQw.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-p3tS5p-6-X1bTOloKL8HQq3_AXukBSRuuOLipDtG_J8.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-6RbmSOf-xBUZw-_Ifr71iYZ6iyl9fl6Y-ehFbKQxsk4.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-uiyFbq4LZfKcaaLGGBiAQR_qQqWE_bkXkXHiUPhHTCI.jar
    Sep 09, 2020 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-8tC0lKyDTFEdbNRuPy3Po-2hiiRfehtkRxyCymCJMzk.jar
    Sep 09, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-bwhZv_BB7KAbGk4J_YinXwbxWHNbahtRoNsyE4FDYKY.jar
    Sep 09, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-z4m3CL4Qp9_YzUuSs4nd43Dq0Z_GYaClKXNewTNeIAs.jar
    Sep 09, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-cT0ftUQUGqv9q2QXV11X8okwmABv40epV60-1VnM1_U.jar
    Sep 09, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2531929208300393189.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-MXmx_nt6SqSCd4EP1bk8xluk-wHAQaJIzeKWmmQeSKs.jar
    Sep 09, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-tsAAV7PcNdxukf9Vu7GY38PofGBVCHeM6fOakW0q1T4.jar
    Sep 09, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-xXhvavQInHdkvFiVtfBXj_rf71CwvZlrmzb7jUzUHdc.jar
    Sep 09, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-LSU3yFrdU2-84O2tiNefDkWJvCZR0UNJzI0Y94uIpWU.jar
    Sep 09, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-RhFoNMWgsNyuV35Zd-4Zp3zPGS_jxDrjIXZrFwngKqA.jar
    Sep 09, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-r1GsKTQts3pFw1H7fHV8n0Psde8emqFT5kgNpsO4TtQ.jar
    Sep 09, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-1kjWpwNvTIokn1Ttrse8dfWqiJP5TmLx8C7k9PMTmi8.jar
    Sep 09, 2020 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 0 seconds
    Sep 09, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 09, 2020 12:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 09, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 09, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 09, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 09, 2020 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92060 bytes, hash 5cfaceefbec389debaccda845b8408fb6acda2b16e433000e41572c305b696e5> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-XPrO777Did66zNqEW4QI-2rNorFuQzAA5BVywwW2luU.pb
    Sep 09, 2020 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 09, 2020 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-09_05_45_24-7788045200651798350?project=apache-beam-testing
    Sep 09, 2020 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-09_05_45_24-7788045200651798350
    Sep 09, 2020 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-09_05_45_24-7788045200651798350
    Sep 09, 2020 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-09T12:45:24.245Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 09, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:45:34.400Z: Worker configuration: n1-standard-1 in us-central1-a.
    Sep 09, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:45:35.427Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 09, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:45:35.458Z: Expanding GroupByKey operations into optimizable parts.
    Sep 09, 2020 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:45:35.483Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 09, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:45:35.541Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 09, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:45:35.565Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 09, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:45:35.597Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 09, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:45:35.636Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 09, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:45:36.049Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 09, 2020 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:45:36.125Z: Starting 5 workers in us-central1-a...
    Sep 09, 2020 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-09T12:45:44.671Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 09, 2020 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:46:07.426Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Sep 09, 2020 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:46:07.458Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Sep 09, 2020 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:46:12.765Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 09, 2020 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:46:29.258Z: Workers have started successfully.
    Sep 09, 2020 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:46:29.290Z: Workers have started successfully.
    Sep 09, 2020 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:47:02.167Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 09, 2020 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:47:02.350Z: Cleaning up.
    Sep 09, 2020 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:47:02.438Z: Stopping worker pool...
    Sep 09, 2020 12:47:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:47:53.240Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 09, 2020 12:47:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T12:47:53.286Z: Worker pool stopped.
    Sep 09, 2020 12:48:02 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-09_05_45_24-7788045200651798350 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): e8e6d2ec-5a40-482e-9eba-61fe1e23b724 and timestamp: 2020-09-09T12:48:02.969000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     14.45

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 09, 2020 12:48:03 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.019 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 2 mins 52.193 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 46s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/yceqaeclazxmi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #973

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/973/display/redirect?page=changes>

Changes:

[noreply] Support updating from a job in DRAINING state


------------------------------------------
[...truncated 292.83 KB...]
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 09, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 09, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 09, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 09, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 09, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 09, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 09, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 09, 2020 6:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 09, 2020 6:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 09, 2020 6:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 210 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-E0DGqBv4OhSM1jtLCJDybep9X6Yp0UqerWN93wt_mbU.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-VEXmG8wUbmix_xCD3JoWgn0wBJRnR5s5XGNCSCbGV9k.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-_c_No_hLNBMauIRky4YBlly-Ino7PqFaEyfI_GfCuwg.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-sC3XgjQ5qHQRBx_ywfUAaSd-R3xVCj00o9XPcufhwFM.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-kbUG4CMLpoKnIWKKrj3z8Od6e7fjhAWty6hJqJnUZHc.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test271759568709461049.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-zcLc0EPn4yceAlfSZVdRMB_6UR8kscviExOj47YRT3k.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-gQxHnxNxs6XjsPJFwUK31ZfI42b_oMFzyKlkBduHNaI.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-RjbtOnbaQl9_eJTzo58Lr5FoaDiqaC4875LvqXEyQ4c.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-j1W8HmKBsV_0_F5qjEq4RJTE9mflh35ZDVt3fpRQxnk.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-QnW5JGLW7WUuFR6ppgEdcun_vr_vhrH3b4qS7Rn35_8.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-FIbdluBN6KoDXceE1nJXASgi9eHe6f-Hn0TkSqpumDY.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-4Xtl9Nqf-nO4pdkkULkd2_csZ-nd3tA_6mMtG79BXUo.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT--D46nHJf3zIf76m43NisOpsnaoNJVBU76UBREOH_Fdc.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-OnHTRHIR2O-V09zsaBs3WAqHtVaKM4XhiD8sHN7WPs0.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-_yavBLvKSzaXNPO55LFPjlj2ZZ9Jb7uAuNbROnEobJc.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-bxNkTW3ViTIr4EYAlleUPC04350FFXEeTFiuKCXAG1Y.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-6yYl3O46aJUBu81iQCLrXIKMhAVHhTZVeN8ArVoeFyM.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-TeEvkw1zj6MMAojBk66ULTEEoTpjbfiKMXmsxBayG68.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-E0DGqBv4OhSM1jtLCJDybep9X6Yp0UqerWN93wt_mbU.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-fdmEqtq_uiiAVgr35TyowOUocdNs_meb-iKJpRmkQAk.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-7WKHfHF8X8zKyXNAM7hwbndpaaQbFttT2DBMdxLCo9U.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-PK_HTHaz3nZCH6s3WuhQhl1cEd3D-2M1MzEdEaJFVxY.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-EJq3BEmeJEiH3dE4wkwHpeDAPU2o21u-dIvfEHhSDaI.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-FftkW4qA5BaVEX4FFwJuduMTfWzEMhgu4JuGt3fAb1A.jar
    Sep 09, 2020 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-k9v53bvtfb-5Cfula1buWNyOKXuUXK3_oKqt7lt1Rq0.jar
    Sep 09, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-BjU77KGNDPq0bMCzYH7zfav7hISM1KKot_jmWP0W4x0.jar
    Sep 09, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-NHKqjh9q_4SzoDuXPfQJdYSerIlyS0OdaPZyXEw2ey0.jar
    Sep 09, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-HPiUxpsN-kxJpoYSGXE_cwDUCs63SMsktEbtI_Xg1MY.jar
    Sep 09, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-O8dsfVyPSa_6kVnisNYLj35t1C_hotwmrnnRblLAUeE.jar
    Sep 09, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-h5pE_WCBJsWaJ-TfTnTwEqYxHUl2xX18xzWJ15zd60c.jar
    Sep 09, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-fL6Cf7qc0oMhi3lHubxeyPd-sCWgtgVY9z9flL7b_oE.jar
    Sep 09, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 180 files cached, 30 files newly uploaded in 0 seconds
    Sep 09, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 09, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 09, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 09, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 09, 2020 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 09, 2020 6:45:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <92059 bytes, hash 67f19168180814e0ef03961b2cddfcad260455ffeab6f204e32ade05ca8aecd0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Z_GRaBgIFODvA5YbLN38rSYEVf_qtvIE4yreBcqK7NA.pb
    Sep 09, 2020 6:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Sep 09, 2020 6:45:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-08_23_45_40-11621385688081722282?project=apache-beam-testing
    Sep 09, 2020 6:45:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-09-08_23_45_40-11621385688081722282
    Sep 09, 2020 6:45:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-09-08_23_45_40-11621385688081722282
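    (For reference, the same job can also be inspected with the standard 'gcloud dataflow jobs' commands. The job ID and region below are copied from this log; the exact invocations are a sketch, not part of the CI run:)
    > gcloud dataflow jobs describe --project=apache-beam-testing --region=us-central1 2020-09-08_23_45_40-11621385688081722282
    > gcloud dataflow jobs list --project=apache-beam-testing --region=us-central1 --status=active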
    Sep 09, 2020 6:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-09T06:45:40.289Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 09, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:45:49.070Z: Worker configuration: n1-standard-1 in us-central1-b.
    Sep 09, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:45:49.835Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 09, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:45:49.888Z: Expanding GroupByKey operations into optimizable parts.
    Sep 09, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:45:49.925Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 09, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:45:50.007Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 09, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:45:50.044Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 09, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:45:50.077Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 09, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:45:50.115Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 09, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:45:50.527Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 09, 2020 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:45:50.599Z: Starting 5 workers in us-central1-b...
    Sep 09, 2020 6:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-09-09T06:45:58.825Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 09, 2020 6:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:46:22.269Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Sep 09, 2020 6:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:46:22.322Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Sep 09, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:46:43.011Z: Workers have started successfully.
    Sep 09, 2020 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:46:43.048Z: Workers have started successfully.
    Sep 09, 2020 6:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:47:16.037Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 09, 2020 6:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:47:16.196Z: Cleaning up.
    Sep 09, 2020 6:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:47:16.294Z: Stopping worker pool...
    Sep 09, 2020 6:51:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:51:08.115Z: Autoscaling: Resized worker pool from 3 to 0.
    Sep 09, 2020 6:51:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-09-09T06:51:08.158Z: Worker pool stopped.
    Sep 09, 2020 6:51:16 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-09-08_23_45_40-11621385688081722282 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 1cba1b82-be4b-4760-91cd-f6427a784fde and timestamp: 2020-09-09T06:51:16.431000000Z:
                     Metric:                    Value:
                   read_time                     13.74
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 09, 2020 6:51:16 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
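    (The warning above means the read_time/fields_read metrics from this run were not exported to InfluxDB. As a rough sketch only, with option names and values that are illustrative assumptions rather than this job's actual configuration, the publisher is normally given its measurement and database through pipeline options passed to the test JVM:)
    > ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest \
    >     -DintegrationTestPipelineOptions='["--influxMeasurement=sql_bqio_read_java_batch","--influxDatabase=beam_test_metrics","--influxHost=http://localhost:8086"]'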

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.035 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 5 mins 51.753 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
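
To reproduce locally, the failing task can be rerun on its own. The task path and test class are taken from this log, while --tests, --stacktrace and --info are standard Gradle options; this is a sketch rather than the exact invocation the CI job uses, and the test additionally needs GCP credentials plus -DintegrationTestPipelineOptions to actually reach BigQuery:

> ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest \
>     --tests org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT \
>     --stacktrace --info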

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 58s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/fqwfyydrzt7fa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org